
MNIST Classifier with Multi-layer Perceptron, Deep Learning with OOP

Posted 7 years ago

This project implements a straightforward perceptron classifier to demonstrate the programming advantages of Mathematica: the code stays simple and gives a good perspective on deep learning concepts. Because the implementation is simple, we can quickly evaluate ideas and observe the effect of varying parameters.

In this case, the perceptron is constructed with three layers: an input layer, a hidden layer, and an output layer, each with an adequate number of nodes. OOP, the Object-Oriented Paradigm for Mathematica (https://www.slideshare.net/kobayashikorio/oop-for-mathematica), is the key idea behind the implementation: the perceptron kernel is represented as a cluster of nodes, each a class-derived instance, connected to one another. An MNIST classifier is one of the targets used to evaluate this Mathematica OOP perceptron.
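To illustrate the idea (this is a minimal sketch in the closure-based OOP style, not the author's attached code; the constructor name, layer sizes, and activation are assumptions), each node can be an instance holding its own weights and bias, and a layer is simply a list of such instances:

```mathematica
(* Sigmoid activation, a common assumed choice for a simple perceptron *)
sigmoid[x_] := 1/(1 + Exp[-x]);

(* Hypothetical "node class": the constructor returns an instance whose
   weights and bias live in Module-local variables, i.e. per-instance state *)
newNode[nInputs_] := Module[{w, b},
  w = RandomReal[{-0.5, 0.5}, nInputs];  (* instance-local weight vector *)
  b = RandomReal[{-0.5, 0.5}];           (* instance-local bias *)
  <|"forward" -> Function[x, sigmoid[w . x + b]]|>
];

(* A layer is a cluster of node instances; forward-propagate through each *)
layerForward[layer_, x_] := (#["forward"][x]) & /@ layer;

(* Assumed 784-30-10 topology for 28x28 MNIST images and 10 digit classes *)
hidden = Table[newNode[784], {30}];
output = Table[newNode[30], {10}];
classify[x_] := layerForward[output, layerForward[hidden, x]]
```

Here `classify` returns a length-10 list of activations; the position of the maximum would be taken as the predicted digit.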

Although the computation speed does not match faster implementations such as matrix-style calculation, the developed sample worked well and produced appropriate classifications on the MNIST dataset.
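For comparison, the faster matrix-style calculation mentioned above collapses each layer's cluster of nodes into a single weight matrix, so one dot product replaces a loop over instances (again a sketch with assumed 784-30-10 sizes, not the attached code):

```mathematica
sigmoid[x_] := 1/(1 + Exp[-x]);

(* One weight matrix and bias vector per layer instead of per-node state *)
w1 = RandomReal[{-0.5, 0.5}, {30, 784}]; b1 = RandomReal[{-0.5, 0.5}, 30];
w2 = RandomReal[{-0.5, 0.5}, {10, 30}];  b2 = RandomReal[{-0.5, 0.5}, 10];

(* Whole-layer forward pass as two dot products *)
forward[x_] := sigmoid[w2 . sigmoid[w1 . x + b1] + b2]
```

The trade-off is the one the post describes: the per-node OOP version makes the network structure explicit and easy to experiment with, while the matrix version runs faster on packed numeric arrays.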

2017/5/3: modified the attached source file to match Mathematica ver. 10. 2017/5/18: modified the attached source file to show filter effects.

