Wednesday, August 1, 2018

Artificial Neural Network

Artificial Neural Network (ANN)

Aim:
Trying to mimic the human brain with machines/computers

Neuron
Activation Function
How Neural Networks Work (example)
How Neural Networks Learn
Gradient Descent
Stochastic Gradient Descent
Back Propagation

Neuron

The basic cell of the brain.
A single neuron does not do much on its own; neurons work in large groups (much like ants in a colony).
Neuron Parts
                Axon – the tail
                Cell body – the head, which contains the central nucleus
                Dendrites – the branches
                Synapse – the junction through which an axon passes signals to the dendrites of other neurons (they are not physically connected)
Electrical impulses are the source of input to the brain/neuron.

                Fig 1.1 Neuron



Fig 1.2 – Neuron – Schematic Model

Inputs – the senses or impulses, shown in yellow (x1, x2, x3)
All inputs are independent variables.
Weights – quantifying values that estimate the impact of each input on the decision/prediction (w1, w2, w3)
Synapse – the blue lines connecting each input (x1, x2, x3) to the neuron
Neuron – the green circle at the centre
Activation Function – applied to the weighted sum of the inputs (sum of inputs * weights); this is the neuron's response used to take the decision
Output – (y) – the predicted result
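A rough Python sketch of this schematic neuron (the input, weight and threshold values below are made up purely for illustration):

# Minimal sketch of the schematic neuron above (illustrative values only).
x = [0.5, 0.2, 0.9]      # inputs x1, x2, x3 (independent variables)
w = [0.4, 0.3, 0.8]      # weights w1, w2, w3 (impact of each input)

# Step 1: weighted sum of inputs (the signal arriving through the synapses)
z = sum(xi * wi for xi, wi in zip(x, w))

# Step 2: activation function (here a simple threshold) produces the output y
y = 1 if z >= 0.5 else 0
print(z, y)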

Activation Function:
Four common types of activation functions:
Fig 2.1 Threshold Function

Threshold Function – basically a true-or-false type of function.
The output is 0 until x reaches some threshold value, after which it jumps to 1 and stays there.

Sigmoid Function – the output can be read as the probability of Y being true or false.
Values lie between 0 and 1.
Fig 2.2 Sigmoid Function

Rectifier
The output Y is 0 up to a threshold value (usually 0) and then increases linearly with x.
Fig 2.3 Rectifier

Fig 2.4 Hyperbolic Tangent
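The hyperbolic tangent is similar in shape to the sigmoid, but its values lie between -1 and 1.

A short numpy sketch of the four functions above (using the usual convention of a threshold at 0 – an assumption, not from the figures):

import numpy as np

def threshold(z):          # Fig 2.1 – outputs 0 or 1
    return np.where(z >= 0, 1, 0)

def sigmoid(z):            # Fig 2.2 – values between 0 and 1
    return 1 / (1 + np.exp(-z))

def rectifier(z):          # Fig 2.3 – 0 for negative z, then increases linearly (ReLU)
    return np.maximum(0, z)

def hyperbolic_tangent(z): # Fig 2.4 – values between -1 and 1
    return np.tanh(z)

z = np.linspace(-3, 3, 7)
print(threshold(z), sigmoid(z), rectifier(z), hyperbolic_tangent(z), sep="\n")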

Fig 2.5 Overall Neural Network


Hidden layer – commonly applies the Rectifier function
Output layer – the activation function is then chosen based on the kind of output needed





How ANN works

The case study below predicts the price of a house (Y).
·         The independent variables/factors affecting the price are the Area, number of Bedrooms, Age, and Distance from the city (x1, x2, x3, and x4)
o   Example: once the age of a property passes a certain point it counts as a historical building and the price starts rising again – a behaviour the rectifier function in the hidden layer can capture
·         Weights are the values given to the factors (Age can be a major factor in determining the price of a house, so it is weighted more)
·         A neuron in the hidden layer may be fed by all of the inputs, or by only one input (e.g. Age)
·         The hidden layer considers all of these possibilities based on the data provided as input (a small sketch of the forward pass follows the figure below)
Fig 3.1 – Practical example of input and output parameters
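A minimal sketch of the forward pass for this house-price example (all numbers, weights and the 3-neuron hidden layer are assumed purely for illustration):

import numpy as np

# One row of input data: Area, Bedrooms, Age, Distance from city (x1..x4).
x = np.array([1200.0, 3.0, 25.0, 4.5])

# Weights from the 4 inputs to 3 hidden-layer neurons.
# A zero weight means that input does not feed that neuron
# (e.g. the second neuron only looks at Age).
W_hidden = np.array([
    [0.002, 0.0,  0.0 ],   # Area      -> neurons 1..3
    [0.1,   0.0,  0.3 ],   # Bedrooms
    [0.0,   0.05, 0.0 ],   # Age (only affects neuron 2)
    [-0.2,  0.0, -0.1 ],   # Distance from city
])

hidden = np.maximum(0, x @ W_hidden)   # rectifier in the hidden layer
W_out = np.array([50.0, 80.0, 30.0])   # hidden -> output weights
price = hidden @ W_out                 # predicted price Y
print(price)

Note how a zero weight lets a hidden neuron ignore an input, so one neuron can react to Age alone while another mixes several inputs.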


How ANN learns:

Ycap = predicted output value
Y = real (actual) value
Error = Ycap - Y (computed for each row; the error is fed back to the model so that it keeps reducing)
C = 1/2 (Ycap - Y)² – called the cost function.
Based on the cost function, the weights (w1, w2, ...) are adjusted after each batch of runs.
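A tiny sketch of the cost and the weight adjustment for a single row, assuming a one-weight model Ycap = w * x (the data and learning rate are made-up values):

# Cost for one row and how its slope is used to adjust a weight.
x, y = 2.0, 10.0          # one row: input and real value
w = 3.0                   # current weight
lr = 0.05                 # learning rate (step size), an assumed value

y_cap = w * x                     # predicted value
cost = 0.5 * (y_cap - y) ** 2     # C = 1/2 (Ycap - Y)^2
slope = (y_cap - y) * x           # dC/dw for this simple model
w = w - lr * slope                # move downhill: adjust the weight against the slope
print(cost, slope, w)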






Fig 4.1  Cost Function


Fig 4.2 Back Propagation Error Feedback

Fig 4.3 Back Propagation
Gradient
The gradient is the slope of the error/cost curve (C on the Y axis) with respect to a weight.
If the slope is negative, it is downhill, and the weights have to be adjusted accordingly (step in the downhill direction to reduce the cost).





Optimization – the cost curve looks like a hill/valley.
The bottom-most point of the curve corresponds to the optimal weight.

Gradient Descent: the slope of the cost (based on Ycap - Y) is computed over the whole batch, and the weights are corrected once per batch run.
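A minimal batch gradient descent sketch (one-weight model, data and learning rate all assumed for illustration; one weight update per pass over the whole batch):

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.0, 4.0, 6.0, 8.0])   # true relationship is y = 2x
w, lr = 0.0, 0.05

for epoch in range(100):
    Y_cap = w * X                           # predictions for every row
    cost = 0.5 * np.mean((Y_cap - Y) ** 2)  # average cost over the batch
    slope = np.mean((Y_cap - Y) * X)        # dC/dw averaged over the whole batch
    w -= lr * slope                         # one weight update per batch
print(w, cost, slope)   # w approaches 2 and the slope approaches 0 (bottom of the curve)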
               

Back Propagation: all the weights are adjusted at once. Because of the way the algorithm is structured, the network knows how much of the error each weight/neuron is responsible for and corrects them all simultaneously.
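A rough back propagation sketch for one hidden layer (rectifier in the hidden layer, sigmoid at the output, squared-error cost; the data, layer sizes and learning rate are all assumed):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                     # 8 rows, 3 inputs
Y = (X.sum(axis=1, keepdims=True) > 0) * 1.0    # made-up target: 1 if the inputs sum to > 0
W1 = rng.normal(scale=0.5, size=(3, 4))         # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))         # hidden -> output weights
lr = 0.1

for epoch in range(500):
    # forward pass: rectifier in the hidden layer, sigmoid at the output
    H = np.maximum(0, X @ W1)
    Y_cap = 1 / (1 + np.exp(-(H @ W2)))

    # backward pass: propagate the error from the output back towards the inputs,
    # so every weight gets its own share of the correction
    d_out = (Y_cap - Y) * Y_cap * (1 - Y_cap)   # error signal at the output
    d_hidden = (d_out @ W2.T) * (H > 0)         # error signal at the hidden layer

    # all the weights are adjusted at once
    W2 -= lr * H.T @ d_out
    W1 -= lr * X.T @ d_hidden

print(np.round(Y_cap.ravel(), 2))   # predictions move towards the 0/1 targets in Y
print(Y.ravel())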


Fig 5.1 Stochastic Gradient Descent
In stochastic gradient descent the error is found and the weights (w1, w2, ...) are adjusted after every single run (row), instead of once per batch.
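The same one-weight example as the batch sketch above, but updated stochastically, one row at a time (values assumed for illustration):

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.0, 4.0, 6.0, 8.0])
w, lr = 0.0, 0.05

for epoch in range(20):
    for x, y in zip(X, Y):       # one row at a time
        y_cap = w * x
        slope = (y_cap - y) * x  # slope of the cost for this single row
        w -= lr * slope          # weight adjusted immediately, every run
print(w)                         # approaches 2, but along a noisier path than batch gradient descent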



General Steps: