Artificial Neural Networks (ANNs) are among the most widely used learning algorithms today, applied in many fields such as computer vision, speech recognition, natural language processing, and autonomous driving. Big companies rely on them: Google with its giant search engine, Amazon with its recommender system, Facebook with its automated tagging, and Apple with its Siri application. Over the last decade, powerful hardware improvements have allowed us to efficiently design and train complicated ANNs with many layers and billions of neurons (deep learning) to solve complex real-world problems. Although today's networks are very sophisticated, understanding the basic concepts of ANNs and how they work is very important for anyone who wants to go on to other ANN architectures, such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), or to apply ANNs successfully.
2. History and Overview about Artificial Neural Network
3. Single neural network
- 3.1 Perceptron
- 3.1.1 The Unit Step function
- 3.1.2 The Perceptron rules
- 3.1.3 The bias term
- 3.1.4 Implement Perceptron in Python
- 3.2 Adaptive Linear Neurons
- 3.2.1 Gradient Descent rule (Delta rule)
- 3.2.2 Learning rate in Gradient Descent
- 3.2.3 Implement Adaline in Python to classify Iris data
- 3.2.4 Learning via types of Gradient Descent
- 3.3 Problems with Perceptron (AI Winter)
4. Multi-layer Neural Network
- 4.1 Overview about Multi-layer Neural Network
- 4.2 Forward Propagation
- 4.3 Cost function
- 4.4 Backpropagation
- 4.5 Implement simple Multi-layer Neural Network to solve the problem of Perceptron
- 4.6 Some optional techniques for Multi-layer Neural Network Optimization
- 4.7 Multi-layer Neural Network for binary/multi classification
5. Implement Multi-layer Neural Network to classify MNIST data
- 5.1 Overview about MNIST data
- 5.2 Implement Multi-layer Neural Network
- 5.3 Debugging Neural Network with Gradient Descent Checking
6. Summary
Congratulations! We have gone through a lot about neural networks, from their birth to the present day, so let's review the main points. We started with simple neural networks called the Perceptron and Adaline; although they can only handle linearly separable data, they introduce background ideas, such as minimizing a cost function, that are still used today. Then we got trapped in the problem of non-linear data, which contributed to the AI Winter. About 20 years later, AI was renewed: a multi-layer neural network, trained with backpropagation, could solve this problem, but without careful optimization it can suffer from expensive computation, local minima, overfitting, vanishing gradients, etc. We presented several standard techniques, such as regularization, the momentum term, and adaptive learning rates, to help our models work better. Finally, to make sure our network runs correctly and to gain confidence in our code, gradient descent checking came to our rescue.
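As a quick illustration of the gradient checking idea recapped above, here is a minimal sketch (the toy data, function names, and the Adaline-style sum-of-squared-errors cost are our own choices for illustration, not code from the chapters): we compare the hand-derived analytical gradient against a central-difference numerical approximation, and the two should agree almost exactly.

```python
import numpy as np

def cost(w, X, y):
    """Sum-of-squared-errors cost for a linear unit (Adaline-style)."""
    errors = y - X.dot(w)
    return 0.5 * np.sum(errors ** 2)

def analytical_grad(w, X, y):
    """Gradient of the SSE cost derived by hand: -X^T (y - Xw)."""
    return -X.T.dot(y - X.dot(w))

def numerical_grad(w, X, y, eps=1e-5):
    """Central-difference approximation: (C(w+eps) - C(w-eps)) / (2*eps), per weight."""
    grad = np.zeros_like(w)
    for i in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        grad[i] = (cost(w_plus, X, y) - cost(w_minus, X, y)) / (2 * eps)
    return grad

# Toy data: 4 samples, 2 features
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]])
y = np.array([1.0, -1.0, 1.0, -1.0])
w = np.array([0.1, -0.2])

diff = np.linalg.norm(analytical_grad(w, X, y) - numerical_grad(w, X, y))
print("difference:", diff)  # tiny if the analytical gradient is correct
```

If the difference is not close to zero, the backpropagation code (here, the hand-derived gradient) has a bug; the same check extends weight by weight to a full multi-layer network.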
We have only covered the basics of neural networks, and there are many advanced techniques out there that are considered more accurate, better performing, and better optimized than the standard ones we learned; at first they may seem quite challenging, or may not make sense. But don't worry: learning is a marathon, not a sprint. If you fully understand the material here, congratulations, because you are ready to go out, pick up those advanced topics, and understand them without too much difficulty or long meditation.
7. References
[1] Trevor Hastie, Robert Tibshirani, Jerome Friedman. The Elements of Statistical Learning. Section 3.4: "Shrinkage Methods".
[2] Sebastian Raschka. Python Machine Learning. Chapter 12: "Training Artificial Neural Networks for Image Recognition".
[3] Christopher M. Bishop. Pattern Recognition and Machine Learning. Section 5.3: "Error Backpropagation".
[4] Andrew Ng. Machine Learning course. "Neural Networks: Learning, Gradient Checking".
[5] Yaser Abu-Mostafa. Learning From Data. Lecture 12: "Regularization".
8. Future work
Understanding Artificial Neural Networks from scratch is the groundwork for other neural network architectures such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), etc. The multi-layer neural network itself has seen many improvements, with sophisticated optimization techniques that we can now apply to real-world applications. In the future we will continue into the field of Deep Learning with Convolutional Neural Networks, so we have plenty of work ahead that we will love to do.
9. Source code available here