1. Introduction
2. History and Overview of Artificial Neural Networks
3. Single neural network
- 3.1 Perceptron
- 3.1.1 The Unit Step function
- 3.1.2 The Perceptron rules
- 3.1.3 The bias term
- 3.1.4 Implement Perceptron in Python
- 3.2 Adaptive Linear Neurons
- 3.2.1 Gradient Descent rule (Delta rule)
- 3.2.2 Learning rate in Gradient Descent
- 3.2.3 Implement Adaline in Python to classify Iris data
- 3.2.4 Learning via types of Gradient Descent
- 3.3 Problems with Perceptron (AI Winter)
4. Multi-layer Neural Network
- 4.1 Overview of Multi-layer Neural Network
- 4.2 Forward Propagation
- 4.3 Cost function
- 4.4 Backpropagation
- 4.5 Implement simple Multi-layer Neural Network to solve the problem of Perceptron
- 4.6 Some optional techniques for Multi-layer Neural Network Optimization
- 4.7 Multi-layer Neural Network for binary/multi-class classification
5. Classify MNIST data with a Multi-layer Neural Network
- 5.1 Overview of MNIST data
- 5.2 Implement Multi-layer Neural Network
- 5.3 Debugging Neural Network with Gradient Checking
7. References
The bias term
Look at the pictures of the Perceptron and equation (1), where we have an extra term w0x0. It is called the bias, so what is it and why do we definitely need it? Firstly, it is a constant term: x0 is always equal to 1, while w0 is a weight like any other in the Perceptron, so we can update w0 during learning, too. So, what is the role of the bias in a Neural Network?
The bias allows us to shift the Unit step function to the left or right, which can be necessary for successful learning. In the Perceptron we usually set x0 = 1 and w0 = −θ; the short derivation and the images below should give an intuition for this.
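Concretely, the shift comes from folding the threshold θ of the step function into the net input z from equation (1):

$$
\mathbf{w}^T\mathbf{x} \ge \theta
\;\Longleftrightarrow\;
\underbrace{(-\theta)}_{w_0}\,\underbrace{(1)}_{x_0} + \mathbf{w}^T\mathbf{x} \ge 0
\;\Longleftrightarrow\;
z = w_0 x_0 + w_1 x_1 + \dots + w_m x_m \ge 0
$$

so the Unit step function always compares z against the fixed threshold 0, and moving w0 up or down slides the decision boundary left or right.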
So, why would we want to shift the Unit step function? Consider the case where every input is equal to 0, but we want the Perceptron's output to be 1. Adjusting the weights is hopeless because z = wᵀx is always equal to 0; the bias solves exactly this situation. That is why we definitely need it, and in a Multi-layer Neural Network the bias plays the same role, shifting the Sigmoid function instead of the Unit step function.
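As a minimal sketch of this point (plain NumPy; `unit_step` and `perceptron_output` are illustrative names here, not the implementation from section 3.1.4), the example below shows that with an all-zero input the bias alone determines the output:

```python
import numpy as np

def unit_step(z):
    # Unit step activation: output 1 when the net input z >= 0, else 0
    return np.where(z >= 0, 1, 0)

def perceptron_output(x, w, w0):
    # Net input z = w^T x + w0, where w0 is the bias (x0 = 1 is implicit)
    z = np.dot(w, x) + w0
    return unit_step(z)

x = np.zeros(3)                 # all inputs are 0
w = np.array([0.5, -0.2, 0.1])  # the weights are irrelevant here: w^T x = 0

print(perceptron_output(x, w, w0=-1.0))  # z = -1 -> output 0
print(perceptron_output(x, w, w0=1.0))   # z =  1 -> output 1
```

No matter how we adjust the weights w, wᵀx stays 0 for this input, so only changing the bias w0 can flip the output from 0 to 1, which is exactly the situation described above.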