TrisZaska's Machine Learning Blog

Implement Multi-layer Neural Network in Python

1. Introduction
2. History and Overview of Artificial Neural Networks
3. Single neural network
4. Multi-layer neural network
5. Implement and use a Multi-layer Neural Network to classify MNIST data
6. Summary
7. References

Implement Multi-layer Neural Network

It's quite similar to when we implemented a Neural Network to solve the Perceptron's problem, but in this exercise we'll build a stronger Multi-layer Neural Network that can deal with real, larger data, using all of the techniques we've learned. Let's start.
### Import the needed libraries

### Implement the MLP classifier and some needed methods

Alright, we've finished implementing the Neural Network, so let's use the prepared data to train the model. We use epochs=800 and minibatches=50. We also choose a small learning rate eta, set the momentum alpha a bit larger, and add L2 regularization for weight decay. A sketch of all of these steps follows.
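Here is a minimal, self-contained sketch of these steps, assuming a small one-hidden-layer MLP. The class name `NeuralNetMLP`, its parameter names, and the `X_train`/`y_train` arrays (the MNIST training data we prepared earlier) are illustrative assumptions, not necessarily the exact code used here.

```python
# Minimal sketch of a one-hidden-layer MLP trained with mini-batch
# gradient descent, momentum, and L2 regularization. The class and
# parameter names are illustrative assumptions.
import numpy as np

class NeuralNetMLP(object):
    def __init__(self, n_hidden=50, epochs=800, minibatches=50,
                 eta=0.001, alpha=0.9, l2=0.1, seed=1):
        self.n_hidden = n_hidden        # number of hidden units
        self.epochs = epochs            # passes over the training set
        self.minibatches = minibatches  # mini-batches per epoch
        self.eta = eta                  # learning rate
        self.alpha = alpha              # momentum term
        self.l2 = l2                    # L2 regularization strength
        self.rng = np.random.RandomState(seed)
        self.cost_ = []                 # cost of every mini-batch

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))

    def _onehot(self, y, n_classes):
        onehot = np.zeros((y.shape[0], n_classes))
        onehot[np.arange(y.shape[0]), y] = 1.0
        return onehot

    def _forward(self, X):
        a_h = self._sigmoid(X.dot(self.w_h) + self.b_h)           # hidden layer
        a_out = self._sigmoid(a_h.dot(self.w_out) + self.b_out)   # output layer
        return a_h, a_out

    def fit(self, X, y):
        n_features, n_classes = X.shape[1], np.unique(y).shape[0]
        y_enc = self._onehot(y, n_classes)
        # small random weights, zero biases
        self.w_h = self.rng.normal(0.0, 0.1, (n_features, self.n_hidden))
        self.b_h = np.zeros(self.n_hidden)
        self.w_out = self.rng.normal(0.0, 0.1, (self.n_hidden, n_classes))
        self.b_out = np.zeros(n_classes)
        v_wh, v_wout = np.zeros_like(self.w_h), np.zeros_like(self.w_out)
        for _ in range(self.epochs):
            # shuffle, then split each epoch into `minibatches` parts
            for batch in np.array_split(self.rng.permutation(X.shape[0]),
                                        self.minibatches):
                a_h, a_out = self._forward(X[batch])
                # backpropagation with a logistic (cross-entropy) cost
                delta_out = a_out - y_enc[batch]
                delta_h = delta_out.dot(self.w_out.T) * a_h * (1.0 - a_h)
                grad_wout = a_h.T.dot(delta_out) + self.l2 * self.w_out
                grad_wh = X[batch].T.dot(delta_h) + self.l2 * self.w_h
                # momentum: blend the new step with the previous one
                v_wout = self.alpha * v_wout - self.eta * grad_wout
                v_wh = self.alpha * v_wh - self.eta * grad_wh
                self.w_out += v_wout
                self.w_h += v_wh
                self.b_out -= self.eta * delta_out.sum(axis=0)
                self.b_h -= self.eta * delta_h.sum(axis=0)
                # record the regularized cross-entropy cost of this batch
                cost = (-y_enc[batch] * np.log(a_out + 1e-9)
                        - (1.0 - y_enc[batch]) * np.log(1.0 - a_out + 1e-9)).sum()
                cost += self.l2 / 2.0 * ((self.w_h ** 2).sum()
                                         + (self.w_out ** 2).sum())
                self.cost_.append(cost)
        return self

    def predict(self, X):
        return np.argmax(self._forward(X)[1], axis=1)

# train on the MNIST data prepared earlier (X_train, y_train assumed)
nn = NeuralNetMLP(n_hidden=50, epochs=800, minibatches=50,
                  eta=0.001, alpha=0.9, l2=0.1)
nn.fit(X_train, y_train)
```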
### Here is the result on my own computer

### And the error curve after training
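A curve like the one shown can be produced from the per-mini-batch costs, assuming they were collected in a `cost_` list as in the sketch above:

```python
import matplotlib.pyplot as plt

# one cost value per mini-batch, hence the very noisy curve
plt.plot(range(len(nn.cost_)), nn.cost_)
plt.xlabel('Mini-batches')
plt.ylabel('Cost')
plt.show()
```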
As you can see, the error curve is very noisy, right? That's because we trained our model with Mini-batch Gradient Descent. Do you remember? Mini-batch Gradient Descent trains on one small part of the data at a time instead of the whole dataset at once per epoch, and that is what leads to the result above. We don't show it here, but for a better, less noisy visualization you can average the cost over every epoch and plot that instead, as sketched below. Now, let's take a look at the accuracy on the training and test samples.
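A sketch of that smoothing, plus the training accuracy, again assuming the `nn` model and the `cost_` list from the sketch above:

```python
import numpy as np
import matplotlib.pyplot as plt

# average the mini-batch costs within each epoch to smooth the curve
cost_avgs = np.array(nn.cost_).reshape(nn.epochs, nn.minibatches).mean(axis=1)
plt.plot(range(nn.epochs), cost_avgs)
plt.xlabel('Epochs')
plt.ylabel('Average cost per epoch')
plt.show()

# accuracy on the training samples
y_train_pred = nn.predict(X_train)
print('Training accuracy: %.2f%%' % (np.mean(y_train == y_train_pred) * 100))
```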



### Print out the accuracy on the test dataset
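A sketch of this step, assuming `X_test`/`y_test` hold the held-out MNIST samples prepared earlier:

```python
# accuracy on the unseen test samples
y_test_pred = nn.predict(X_test)
print('Test accuracy: %.2f%%' % (np.mean(y_test == y_test_pred) * 100))
```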



It seems our model works better on the training dataset than on the test dataset. Does that remind you of something we learned before, hmm? It's quite overfitting, isn't it? There are many useful techniques out there that can help you tune the parameters and achieve a model that does well on both training and test data; you can find and try them yourself, it's a lot of fun. Now let's take a look at some digits we classified correctly (a sketch follows).
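One way to display a grid of correctly classified digits; this is a sketch that assumes the 784-pixel MNIST rows and the `nn` model from above:

```python
import matplotlib.pyplot as plt

y_test_pred = nn.predict(X_test)
correct = X_test[y_test == y_test_pred]        # images the model got right
correct_lab = y_test_pred[y_test == y_test_pred]

fig, ax = plt.subplots(nrows=5, ncols=5, sharex=True, sharey=True)
ax = ax.flatten()
for i in range(25):
    ax[i].imshow(correct[i].reshape(28, 28),   # MNIST digits are 28x28 pixels
                 cmap='Greys', interpolation='nearest')
    ax[i].set_title('p: %d' % correct_lab[i])  # the predicted label
ax[0].set_xticks([])
ax[0].set_yticks([])
plt.tight_layout()
plt.show()
```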
### And some images our model classified incorrectly
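The same idea works for the misclassified images, showing the true label (t) next to the prediction (p); this continues from the sketch above:

```python
# select the test images the model got wrong
miscl = X_test[y_test != y_test_pred]
miscl_true = y_test[y_test != y_test_pred]
miscl_pred = y_test_pred[y_test != y_test_pred]

fig, ax = plt.subplots(nrows=5, ncols=5, sharex=True, sharey=True)
ax = ax.flatten()
for i in range(25):
    ax[i].imshow(miscl[i].reshape(28, 28), cmap='Greys', interpolation='nearest')
    ax[i].set_title('t: %d, p: %d' % (miscl_true[i], miscl_pred[i]))
ax[0].set_xticks([])
ax[0].set_yticks([])
plt.tight_layout()
plt.show()
```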
