An investigation into weight importance measures in neural networks, relating to sequential learning and interpretability.
An investigation into sequential learning of tasks using feed-forward networks built with TensorFlow
CEL Continual Learning
Federated Echo State Networks for Stress Prediction in the Automotive Use Case. Master Thesis in Artificial Intelligence @ University of Pisa
Multi-domain adaptation of quick sentiment analysis across multiple categories of tasks, such as classifying the nature of reviews of various products on the Amazon website
Comparative evaluation of incremental machine learning methods
This is the temporary version of the MINT Lab continual-learning website.
TensorFlow 1.x implementation of EWC, evaluated on permuted MNIST
Work in progress: PyTorch implementation of supervised and Deep Q-Learning EWC (Elastic Weight Consolidation), introduced in "Overcoming Catastrophic Forgetting in Neural Networks"
PyTorch implementation of a VAE-based generative classifier, as well as other class-incremental learning methods that do not store data (DGR, BI-R, EWC, SI, CWR, CWR+, AR1, the "labels trick", SLDA).
Elastic weight consolidation technique for incremental learning.
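For orientation among the implementations listed here, the core of EWC is a quadratic penalty that anchors parameters important to previous tasks: the new-task loss is augmented with (λ/2) Σᵢ Fᵢ (θᵢ − θᵢ*)², where θ* are the parameters after the old task and F is a diagonal Fisher-information estimate. A minimal PyTorch sketch (the function and variable names here are illustrative, not taken from any of the repositories above; the Fisher estimate is a crude single-batch squared-gradient approximation):

```python
import torch
import torch.nn as nn

def estimate_fisher(model, loss_fn, data, targets):
    # Diagonal Fisher approximation: squared gradients of the loss
    # at the old task's optimum (here from a single batch; real
    # implementations average over many samples of the old task).
    model.zero_grad()
    loss_fn(model(data), targets).backward()
    return {n: p.grad.detach() ** 2 for n, p in model.named_parameters()}

def ewc_penalty(model, fisher, old_params, lam=100.0):
    # (lam / 2) * sum_i F_i * (theta_i - theta_i*)^2
    penalty = torch.zeros(())
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * penalty

# Usage: after training on task A, snapshot the parameters and Fisher,
# then add ewc_penalty(...) to the loss while training on task B.
model = nn.Linear(4, 2)
x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
fisher = estimate_fisher(model, nn.CrossEntropyLoss(), x, y)
old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
```

At the snapshot itself the penalty is zero; it grows as training on the new task moves parameters that the Fisher estimate marks as important.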
Continual Hyperparameter Selection Framework. Compares 11 state-of-the-art Lifelong Learning methods and 4 baselines. Official Codebase of "A continual learning survey: Defying forgetting in classification tasks." in IEEE TPAMI.
Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, AGEM, LwF, iCarl, GDumb, and other strategies.
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.