
feat(optim): AdaDelta RAdam Adamax optimizer support #171

Merged: 39 commits into metaopt:main on Jul 22, 2023

Conversation

@JieRen98 (Collaborator) commented Jun 18, 2023

Roadmap

  • low-level alias
  • high-level Optimizer
  • high-level MetaOptimizer (usage sketch for all three API levels after this list)
  • low-level test
  • high-level Optimizer test
  • CHANGELOG
  • import test
  • api docs
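
A minimal usage sketch of the three API levels in the roadmap above. This is hedged: it assumes the new AdaDelta/RAdam/Adamax entry points mirror the existing torchopt.adam / torchopt.Adam / torchopt.MetaAdam conventions, so treat the exact names (torchopt.adadelta, torchopt.AdaDelta, torchopt.MetaAdaDelta) and signatures as assumptions to check against the merged code.

```python
# Hedged sketch: assumes the new AdaDelta entry points follow the existing
# Adam-family conventions (functional alias, torch.optim-style Optimizer,
# and differentiable MetaOptimizer). The same pattern would apply to
# RAdam/Adamax via torchopt.radam / torchopt.adamax and friends.
import torch
import torch.nn as nn
import torchopt

model = nn.Linear(4, 2)
loss_fn = nn.MSELoss()
x, y = torch.randn(8, 4), torch.randn(8, 2)

# 1) Low-level alias: a pure optax-style GradientTransformation.
params = tuple(model.parameters())
opt = torchopt.adadelta(lr=1.0)  # assumed alias name from this PR
state = opt.init(params)
grads = torch.autograd.grad(loss_fn(model(x), y), params)
updates, state = opt.update(grads, state, params=params)
torchopt.apply_updates(params, updates)

# 2) High-level Optimizer: drop-in torch.optim-style wrapper.
optimizer = torchopt.AdaDelta(model.parameters(), lr=1.0)
optimizer.zero_grad()
loss_fn(model(x), y).backward()
optimizer.step()

# 3) High-level MetaOptimizer: updates stay differentiable for meta-learning.
meta_opt = torchopt.MetaAdaDelta(model, lr=1.0)  # assumed meta variant name
meta_opt.step(loss_fn(model(x), y))  # in-place differentiable update
```

The functional alias keeps the update rule pure and stateless; the MetaOptimizer builds on it so that inner-loop updates remain differentiable for meta-learning.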

@codecov (codecov bot) commented Jun 19, 2023

Codecov Report

Patch coverage: 96.58%; project coverage change: +0.28% 🎉

Comparison: base (3f68378) 93.37% vs. head (3e90e72) 93.65%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #171      +/-   ##
==========================================
+ Coverage   93.37%   93.65%   +0.28%     
==========================================
  Files          71       83      +12     
  Lines        2683     2943     +260     
==========================================
+ Hits         2505     2756     +251     
- Misses        178      187       +9     
| Flag | Coverage Δ |
| ---- | ---------- |
| unittests | 93.65% <96.58%> (+0.28%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
| -------------- | ---------- |
| torchopt/alias/adagrad.py | 100.00% <ø> (ø) |
| torchopt/optim/adagrad.py | 100.00% <ø> (ø) |
| torchopt/optim/func/base.py | 88.46% <ø> (ø) |
| torchopt/optim/meta/adagrad.py | 90.91% <ø> (ø) |
| torchopt/transform/scale_by_radam.py | 88.00% <88.00%> (ø) |
| torchopt/optim/meta/radam.py | 90.00% <90.00%> (ø) |
| torchopt/optim/meta/adadelta.py | 90.91% <90.91%> (ø) |
| torchopt/optim/meta/adamax.py | 90.91% <90.91%> (ø) |
| torchopt/__init__.py | 100.00% <100.00%> (ø) |
| torchopt/alias/__init__.py | 100.00% <100.00%> (ø) |

... and 11 more
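
For orientation, the new torchopt/transform/scale_by_radam.py in the list above suggests the low-level aliases are composed optax-style from per-optimizer transforms. A hedged sketch of that composition follows: only scale_by_radam.py is confirmed by the file list; combine.chain and transform.scale follow TorchOpt's existing convention for torchopt.adam, and radam_like plus the b1/b2 keyword names are hypothetical.

```python
# Hedged sketch of the optax-style composition likely behind the new aliases.
# radam_like is a hypothetical helper name; chain/scale usage follows
# TorchOpt's existing alias convention, and b1/b2 keywords are assumptions.
from torchopt import combine, transform

def radam_like(lr: float = 1e-3, b1: float = 0.9, b2: float = 0.999):
    """Compose an RAdam-style update: rectified scaling, then a -lr step."""
    return combine.chain(
        transform.scale_by_radam(b1=b1, b2=b2),  # assumed keyword names
        transform.scale(-lr),  # negate so the update descends the loss
    )
```

Usage would mirror the functional alias shown earlier in this thread: build `opt = radam_like(lr=1e-3)`, then call `opt.init(params)` and `opt.update(...)`.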


@Benjamin-eecs self-requested a review on Jun 19, 2023 at 14:24.
@JieRen98 requested review from XuehaiPan and Benjamin-eecs and removed the request for Benjamin-eecs on Jun 19, 2023 at 14:27.
@Benjamin-eecs changed the title from "feat(optim): support adadelta" to "feat(optim): adadelta optimizer support" on Jul 1, 2023.
@XuehaiPan changed the title from "feat(optim): adadelta optimizer support" to "feat(optim): AdaDelta optimizer support" on Jul 1, 2023.
@XuehaiPan added the "enhancement (New feature or request)" and "feature (New feature)" labels on Jul 1, 2023.
@JieRen98 changed the title from "feat(optim): AdaDelta optimizer support" to "feat(optim): AdaDelta RAdam Adamax optimizer support" on Jul 2, 2023.
@Benjamin-eecs linked an issue on Jul 22, 2023 that may be closed by this pull request.
@Benjamin-eecs previously approved these changes on Jul 22, 2023.
@Benjamin-eecs self-requested a review on Jul 22, 2023 at 20:32.
@Benjamin-eecs merged commit ef51cc8 into metaopt:main on Jul 22, 2023 (7 checks passed).
Labels: enhancement (New feature or request), feature (New feature)
Development

Successfully merging this pull request may close these issues.

[BUG] Naming of Adagrad optimizer
3 participants