In this repo, the backpropagation algorithm for feedforward neural networks is implemented from scratch in C.

sm823zw/Neural-Network-Backprop-using-C

Neural-Network-using-C

In this project, the backpropagation algorithm for feedforward neural networks is implemented in C. A data structure is created to store and manipulate the network's weights, and routines are provided for forward propagation, backpropagation, and weight updates. The user can specify the number of hidden layers, the number of neurons in each hidden layer, the loss function, the optimizer, and the activation function.

Backpropagation updates the weights of an artificial neural network via gradient-descent optimization. Given a network and a loss function, the algorithm computes the gradient of the loss with respect to each of the network's weights. Errors are propagated in a backward direction from the output layer, through the hidden layers, to the input layer, and the resulting gradients are used to update the weights.

For demonstration, a neural network model was trained and tested on the MNIST dataset for handwritten digit recognition.
