Artificial Neural Network

In this project I have developed a 2-layer neural network without using any major libraries such as Keras or TensorFlow, implementing gradient descent and backpropagation, along with the feed-forward architecture, purely through mathematical calculations.

If you have been a developer, or seen one at work, you know what it is like to search for bugs in code. You fire various test cases by varying the inputs or circumstances and look at the output. The change in output gives you a hint about where to look for the bug: which module to check, which lines to read. Once you find it, you make the changes, and the exercise continues until you have the right code/application.

Neural networks work in a very similar manner. A network takes several inputs, processes them through multiple neurons across one or more hidden layers, and returns the result using an output layer. This estimation process is technically known as "forward propagation".
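The forward pass described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the repository's actual code: the layer sizes, the sigmoid activation, and all variable names are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(z):
    # squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))         # 4 samples, 3 input features
W1 = rng.normal(size=(3, 5))        # input -> hidden weights
b1 = np.zeros(5)                    # hidden-layer bias
W2 = rng.normal(size=(5, 1))        # hidden -> output weights
b2 = np.zeros(1)                    # output-layer bias

hidden = sigmoid(X @ W1 + b1)       # hidden-layer activations
output = sigmoid(hidden @ W2 + b2)  # network prediction, shape (4, 1)
print(output.shape)
```

Each layer is just a matrix multiplication followed by a non-linear activation; stacking two of them gives the 2-layer network this project implements.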

Next, we compare the result with the actual output. The task is to make the network's output as close as possible to the actual (desired) output. Each neuron contributes some error to the final output. How do you reduce that error?

We reduce the weights of the neurons that contribute more to the error, travelling backwards through the network to find where the error lies. This process is known as "backward propagation" (backpropagation).

To reduce the number of iterations needed to minimize the error, neural networks use a common optimization algorithm known as "gradient descent", which adjusts the weights quickly and efficiently.
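Backpropagation and gradient descent can be sketched together as a training loop for the same kind of 2-layer sigmoid network. This is an illustrative sketch assuming NumPy and a mean-squared-error loss; the hyperparameters and names are not taken from the repository.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))                      # toy inputs
y = rng.integers(0, 2, size=(4, 1)).astype(float)  # toy binary targets
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)
lr = 0.5                                         # learning rate

losses = []
for _ in range(200):
    # forward propagation
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    losses.append(np.mean((output - y) ** 2))

    # backward propagation: push the error from the output
    # layer back to the hidden layer (sigmoid' = s * (1 - s))
    d_out = (output - y) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)

    # gradient descent: move each weight against its gradient
    W2 -= lr * (hidden.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_hid) / len(X)
    b1 -= lr * d_hid.mean(axis=0)

print(losses[0], losses[-1])
```

Running the loop, the loss printed at the end is lower than at the start, which is exactly what "travelling back through the network and reducing the weights that contribute to the error" achieves.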

That's it: this is how a neural network works! This is a very simple representation, but it should help you understand the idea in a simple manner.
