# nn-enhanced

Applying a neural network with the Adam optimizer to the heart failure clinical records dataset to compare the test errors of the sigmoid, tanh, and ReLU activation functions.
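For reference, the three activation functions being compared are the standard definitions sketched below (a minimal NumPy sketch; the script's own implementation may differ):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # squashes inputs to (0, 1)

def tanh(x):
    return np.tanh(x)                 # squashes inputs to (-1, 1), zero-centered

def relu(x):
    return np.maximum(0.0, x)         # passes positives through, zeroes negatives
```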

Train/test split: 80% training, 20% testing.
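A split matching this ratio can be produced with scikit-learn's `train_test_split`. This is a minimal sketch, not the script's own loading code; the dataset filename is an assumption, and `DEATH_EVENT` is the usual target column in this dataset:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Load the heart failure clinical records dataset
# (filename assumed; adjust to match the local copy).
df = pd.read_csv("heart_failure_clinical_records_dataset.csv")

# DEATH_EVENT is the conventional label column in this dataset.
X = df.drop(columns=["DEATH_EVENT"]).to_numpy()
y = df["DEATH_EVENT"].to_numpy()

# 80% training / 20% testing, matching the ratio above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
```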

## How to run

Install the dependencies, then run the script:

```
pip install -r requirements.txt
python nn-enhanced.py
```

Expected output (sample run):

```
ACTIVATION: sigmoid
Total Test error with sigmoid activation:                           8.414172716246473

ACTIVATION: tanh
After 60000 iterations, total error with tanh activation function:  3.0528201232835785
ACTIVATION: tanh
Total Test error with tanh activation:                              7.406216006829898

ACTIVATION: relu
After 60000 iterations, total error with relu activation function:  8.424554943130456
ACTIVATION: relu
Total Test error with relu activation:                               7.2460576025569505
```
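For context, the Adam update that drives training follows the standard algorithm sketched below (a generic NumPy sketch, not the exact code in nn-enhanced.py; the function name, variable names, and hyperparameter values are illustrative defaults):

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: update moment estimates, correct their bias, step the weights."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                 # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive update
    return w, m, v
```

The per-parameter state `m` and `v` start at zero, and the step counter `t` starts at 1 so the bias-correction terms are well defined.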
