GeLUReLUInterpolation

This is the official repository for the paper:
Leveraging Continuously Differentiable Activation for Learning in Analog and Quantized Noisy Environments
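The repository name refers to interpolating between the GELU and ReLU activation functions. As a rough illustration only (a minimal sketch, assuming a simple linear blend with a factor alpha; the paper's exact formulation may differ), such an interpolation can be written as:

import torch
import torch.nn as nn
import torch.nn.functional as F

class GeLUReLUInterpolation(nn.Module):
    # alpha = 0.0 recovers plain ReLU; alpha = 1.0 recovers plain GELU.
    # Both alpha and this linear form are illustrative assumptions.
    def __init__(self, alpha: float = 0.5):
        super().__init__()
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (1.0 - self.alpha) * F.relu(x) + self.alpha * F.gelu(x)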

Requirements

A few Python packages are required to run the simulation.
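At minimum, the experiments depend on PyTorch (the Datasets section below notes that data is fetched through it). A minimal install sketch, assuming pip and the torchvision data loaders:

pip install torch torchvision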

Run the simulation

Each model has its own entry-point script under src/; see the Models section below for the per-model commands.

Datasets

  • CIFAR-10 and CIFAR-100 are used in the experiments. Both datasets are downloaded automatically through PyTorch's torchvision package on first run.
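For reference, a minimal sketch of that automatic download, assuming the standard torchvision loaders (the repository's own data pipeline may differ):

import torchvision

# Both datasets are fetched into ./data on first use and cached afterwards.
cifar10 = torchvision.datasets.CIFAR10(root="./data", train=True, download=True)
cifar100 = torchvision.datasets.CIFAR100(root="./data", train=True, download=True)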

Models

  • ConvNet: A model with six convolutional layers and three fully connected layers. Run it with the src/run_conv.py script.
  • ResNet: Run it with the src/run_resnet.py script.
  • VGG: Run it with the src/run_vgg.py script.
  • ViT: Run it with the src/run_vit.py script.
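Each script is a standalone entry point and is invoked directly with Python; any command-line flags are defined inside the scripts themselves (none are assumed here):

python src/run_conv.py
python src/run_resnet.py
python src/run_vgg.py
python src/run_vit.py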

Cite

If you find this code useful, we would appreciate it if you cited the following paper in your publications:

@article{shah2024leveraging,
  title={Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments},
  author={Shah, Vivswan and Youngblood, Nathan},
  journal={arXiv preprint arXiv:2402.02593},
  url={http://arxiv.org/abs/2402.02593},
  doi={10.48550/arXiv.2402.02593},
  year={2024}
}

Or in textual form:

Shah, Vivswan, and Nathan Youngblood. "Leveraging Continuously Differentiable Activation
Functions for Learning in Quantized Noisy Environments." arXiv preprint arXiv:2402.02593 (2024).
