ML Week4 - Neural Network

October 24, 2017

ML AI
The neural network model is motivated by the brain (e.g. the auditory cortex and the somatosensory cortex). A neuron's I/O mirrors the model: dendrites (inputs) => axon (output), just as the network flows input layer > hidden layer (intermediate layer) > output layer. All hidden-layer nodes are called “activation units”:

$$a_i^{(j)} = \text{“activation” of unit $i$ in layer $j$}$$

$$\Theta^{(j)} = \text{matrix of weights controlling function mapping from layer $j$ to layer $j+1$}$$

Forward Propagation: using the trained parameters $\Theta^{(j)}$, we can predict the output by computing the activations layer by layer up to the output-layer values.
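A minimal NumPy sketch of forward propagation for a network with one hidden layer, assuming sigmoid activations as in the course; the function name `forward_propagate`, the layer sizes, and the random “trained” weights are illustrative assumptions, not from the post:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, applied element-wise."""
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, Theta1, Theta2):
    """Compute the output layer for one example x.

    Theta1 maps layer 1 -> layer 2 and Theta2 maps layer 2 -> layer 3,
    matching the Theta^{(j)} definition above.
    """
    a1 = np.concatenate(([1.0], x))    # add bias unit x_0 = 1
    a2 = sigmoid(Theta1 @ a1)          # hidden-layer activations a^{(2)}
    a2 = np.concatenate(([1.0], a2))   # add bias unit a_0^{(2)} = 1
    a3 = sigmoid(Theta2 @ a2)          # output layer h_Theta(x)
    return a3

# Example: 3 inputs, 4 hidden units, 1 output (random stand-in weights).
rng = np.random.default_rng(0)
Theta1 = rng.normal(size=(4, 4))  # 4 hidden units x (3 inputs + bias)
Theta2 = rng.normal(size=(1, 5))  # 1 output x (4 hidden units + bias)
print(forward_propagate(np.array([0.5, -1.2, 3.0]), Theta1, Theta2))
```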

ML Week2

October 8, 2017

ML AI
Multivariate Linear Regression Hypothesis:

$$ h_\theta(x) = \begin{bmatrix}\theta_0 \hspace{2em} \theta_1 \hspace{2em} \dots \hspace{2em} \theta_n \end{bmatrix} \begin{bmatrix}x_0 \newline x_1 \newline \vdots \newline x_n\end{bmatrix} = \theta^T x $$

Gradient Descent Practices: scale or normalize the feature ranges (e.g. mean normalization) and choose a suitable learning rate $\alpha$. Polynomial Regression, typical equations:

$\theta_0x_0 + \theta_1x_1 + \theta_2x_2^2$

$\theta_0x_0 + \theta_1x_1 + \theta_2x_2^2 + \theta_3x_3^3$

$\theta_0x_0 + \theta_1x_1 + \theta_2 \sqrt{x_2}$

Normal Equation: if n > 1000, it becomes hard (slow) to compute the inverse $(X^TX)^{-1}$ ... Read more
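A short NumPy sketch of these ideas, contrasting gradient descent on mean-normalized features with the normal equation $\theta = (X^TX)^{-1}X^Ty$; the data values, learning rate, and iteration count below are made up for illustration:

```python
import numpy as np

# Toy data: m = 4 examples, n = 2 features (values invented for this sketch).
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 3.0], [1416.0, 2.0]])
y = np.array([400.0, 330.0, 369.0, 232.0])

# Mean normalization: subtract the mean, divide by the standard deviation,
# so gradient descent converges faster on comparably scaled features.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(len(X_norm)), X_norm])  # prepend x_0 = 1

# Gradient descent on J(theta) = (1/2m) * sum((X theta - y)^2).
alpha, m = 0.1, len(y)
theta = np.zeros(Xb.shape[1])
for _ in range(1000):
    theta -= (alpha / m) * Xb.T @ (Xb @ theta - y)

# Normal equation: no iteration or alpha needed, but solving the n x n
# system costs O(n^3), which is why it gets slow for large n.
theta_ne = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

print(theta, theta_ne)  # the two approaches should agree closely
```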

ML Week1

October 6, 2017

ML AI
Notes on Andrew Ng’s ML course. Supervised Learning: in supervised learning, we are given a data set and already know what our correct output should look like, having the idea that there is a relationship between the input and the output. Supervised learning problems are categorized into “regression” and “classification” problems. In a regression problem, we are trying to predict results within a continuous output, meaning that we are trying ... Read more
