ICLR 2025 - Adaptive Retention & Correction: Test-Time Training for Continual Learning

📑 Paper Link

Authors: Haoran Chen, Micah Goldblum, Zuxuan Wu, Yu-Gang Jiang

How to use

Dependencies

  1. torch 2.0.1
  2. torchvision 0.15.2
  3. timm 0.6.12 (note: for reproducing ARC + DER, we recommend timm 0.5.4)
  4. tqdm
  5. numpy
  6. scipy
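
For convenience, the pinned versions above can be installed in one step. This exact command is our suggestion rather than something provided by the repository; adjust the torch build to match your CUDA setup:

    pip install torch==2.0.1 torchvision==0.15.2 timm==0.6.12 tqdm numpy scipy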

Run experiment

  1. Edit the [MODEL NAME].json file for global settings and hyperparameters (a hypothetical sketch of such a file follows these steps).

  2. Run:

    python main.py --config=./exps/[MODEL NAME].json
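
As a rough illustration, a config file might look like the following. Apart from arc_batch_size, the hyperparameter described under implementation details below, every key and value here is hypothetical; consult the provided exps/[MODEL NAME].json files for the actual settings:

    {
        "model_name": "arc",
        "dataset": "cifar100",
        "seed": 1993,
        "device": ["0"],
        "arc_batch_size": 64
    }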

Implementation details

  1. Compared to the PILOT codebase, our changes are concentrated in the _eval_cnn function of the files under the models directory.
  2. Compared to the original conference version, we have re-implemented the framework. The previous version fixed the inference batch size to 1; the new implementation introduces the arc_batch_size hyperparameter, which makes the inference batch size adjustable. Empirically, increasing this value tends to trade a small drop in overall accuracy for faster inference (see the sketch after this list).
  3. The current version achieves substantially faster inference while matching, and in some cases exceeding, the performance of the original version.
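
To make the accuracy/speed trade-off of arc_batch_size concrete, below is a minimal, self-contained PyTorch sketch of a batched test-time evaluation loop. It is not the repository's _eval_cnn: the adaptation objective (entropy minimization) and all names in it are illustrative assumptions, standing in for whatever per-batch update ARC actually performs.

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    def eval_with_test_time_updates(model, dataset, arc_batch_size=1, device="cpu"):
        """Evaluate `model`, applying one hypothetical test-time update per batch.

        With arc_batch_size=1 the model adapts after every sample (slow);
        larger batches amortize the update cost (faster), at the price of a
        coarser adaptation signal, mirroring the trade-off described above.
        """
        loader = DataLoader(dataset, batch_size=arc_batch_size, shuffle=False)
        model = model.to(device)
        optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)
        correct, total = 0, 0
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            # Hypothetical test-time objective: entropy minimization on the batch.
            model.train()
            probs = model(x).softmax(dim=1)
            entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
            optimizer.zero_grad()
            entropy.backward()
            optimizer.step()
            # Predict with the freshly adapted model.
            model.eval()
            with torch.no_grad():
                preds = model(x).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
        return correct / total

    if __name__ == "__main__":
        torch.manual_seed(0)
        model = torch.nn.Linear(16, 4)  # stand-in for a continual learner
        data = TensorDataset(torch.randn(64, 16), torch.randint(0, 4, (64,)))
        print(eval_with_test_time_updates(model, data, arc_batch_size=8))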

Acknowledgments

We thank the authors of the PILOT repository for providing a helpful codebase for our work.

Contact

Feel free to contact us if you have any questions or suggestions. Email: [email protected]

Citation

If you use the code in this repo or find our work helpful, please consider citing:

@inproceedings{chenarc,
  title={Adaptive Retention \& Correction: Test-Time Training for Continual Learning},
  author={Chen, Haoran and Goldblum, Micah and Wu, Zuxuan and Jiang, Yu-Gang},
  booktitle={ICLR 2025},
  year={2025}
}
