Small Network Working

I started modelling a 3-layer neural network under the mistaken assumption that every layer would have the same number of neurons, so it took me a little longer to write and debug a more general setup. The one shown below has 10, 7 and 4 neurons in the input, hidden and output layers respectively. It is intended as an intermediate step towards the 784 input neurons used for character recognition in the book by Tariq Rashid.

The network was used for a trivial (but easily checkable) numerical transformation. The idea was that only one of the 10 inputs would be ‘high’ at any time, representing a number from 0 to 9, and the training requirement was that the 4 outputs would collectively represent the binary equivalent of that number. In the example shown here, ‘5’ is represented by a ‘1’ on input neuron 6, and the expected outputs were 0, 1, 0, 1 – since 0101 is the binary equivalent of 5. Obviously this task does not require a trained network, as the result is easily predictable analytically, but it was a useful exercise to train the trainer!
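As a rough illustration of that encoding (this is just a sketch in Python with NumPy, not the code actually used for this post), the ten training pairs can be built by placing a ‘1’ at position d of a 10-element input vector and pairing it with the 4-bit binary form of d as the target:

```python
import numpy as np

def make_training_pairs():
    """One-hot digit in, 4-bit binary target out (illustrative helper)."""
    pairs = []
    for d in range(10):
        x = np.zeros(10)
        x[d] = 1.0                      # e.g. '5' puts a 1 on input neuron 6
        t = np.array([(d >> b) & 1 for b in (3, 2, 1, 0)], dtype=float)  # 5 -> [0, 1, 0, 1]
        pairs.append((x, t))
    return pairs

for x, t in make_training_pairs():
    print(x.astype(int), t.astype(int))
```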

The graphs and text shown below use the example of ‘5’ as the input, and show the training sequence as the data sloshes forwards and backwards until the outputs are within the required tolerance, chosen as 0.01 in this case.
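For what it is worth, that kind of training loop can be sketched as below. The layer sizes, tanh activation and 0.01 tolerance follow the description in this post; the learning rate, weight initialisation and epoch limit are my own assumptions, so this is an illustrative version rather than the actual program behind the graphs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training pairs: one-hot digit in, 4-bit binary out (as in the sketch above).
pairs = []
for d in range(10):
    x = np.zeros(10)
    x[d] = 1.0
    t = np.array([(d >> b) & 1 for b in (3, 2, 1, 0)], dtype=float)
    pairs.append((x, t))

# Layer sizes from the post: 10 inputs, 7 hidden neurons, 4 outputs.
W1 = rng.normal(0.0, 0.5, (7, 10))   # input -> hidden weights (initialisation assumed)
W2 = rng.normal(0.0, 0.5, (4, 7))    # hidden -> output weights
lr = 0.1                              # learning rate (assumed)
tolerance = 0.01                      # stop when every output is this close to its target

for epoch in range(200000):
    worst = 0.0
    for x, t in pairs:
        h = np.tanh(W1 @ x)           # forward pass through the hidden layer
        y = np.tanh(W2 @ h)           # and through the output layer
        err = t - y
        worst = max(worst, float(np.max(np.abs(err))))
        # Backward pass: d/du tanh(u) = 1 - tanh(u)^2.
        delta_out = err * (1.0 - y ** 2)
        delta_hid = (W2.T @ delta_out) * (1.0 - h ** 2)
        W2 += lr * np.outer(delta_out, h)
        W1 += lr * np.outer(delta_hid, x)
    if worst < tolerance:
        print(f"Converged after {epoch + 1} passes through the training set")
        break
```

With this setup the ‘5’ example corresponds to pairs[5], and feeding it through the trained weights should give outputs close to 0, 1, 0, 1.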

I ought to mention that I used the tanh(x) function rather than the sigmoid(x) function, as I prefer that zero inputs give rise to zero outputs.
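The difference is easy to see numerically: assuming no bias terms, the logistic sigmoid maps a zero pre-activation to 0.5, while tanh maps it to 0, so an all-zero input propagates zeros through a tanh network. Again, just a small illustration rather than anything from the post:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

print(np.tanh(0.0))    # 0.0 -> a zero input stays zero through a tanh layer
print(sigmoid(0.0))    # 0.5 -> a sigmoid layer emits 0.5 even for a zero input
```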

Any constructive comments would be welcome.

[Figures: NetTest.jpg, sm5graph.jpg, 5test.jpg]
