
Lookahead Optimizer for PyTorch


A PyTorch implementation of the paper "Lookahead Optimizer: k steps forward, 1 step back".

Usage:

import torch
from lookahead import Lookahead  # the Lookahead class from this repository's lookahead.py

base_opt = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))  # any base optimizer
lookahead = Lookahead(base_opt, k=5, alpha=0.5)  # wrap the base optimizer with Lookahead
lookahead.zero_grad()
loss_function(model(input), target).backward()  # user-defined loss function
lookahead.step()
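For reference, the update rule from the paper is simple: the base ("fast") optimizer takes k ordinary steps, then the "slow" weights move a fraction alpha toward the fast weights, and the fast weights are reset to the slow ones. Below is a minimal sketch of that rule, not the repository's actual lookahead.py (the class name LookaheadSketch is hypothetical; use the Lookahead class shipped here for real training):

import torch

class LookaheadSketch:
    """Minimal sketch of 'k steps forward, 1 step back'."""

    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.optimizer = base_optimizer
        self.k = k
        self.alpha = alpha
        self.step_count = 0
        # Slow weights start as a detached copy of the model's (fast) weights.
        self.slow_weights = [
            [p.clone().detach() for p in group["params"]]
            for group in base_optimizer.param_groups
        ]

    def zero_grad(self):
        self.optimizer.zero_grad()

    def step(self, closure=None):
        loss = self.optimizer.step(closure)  # one fast step (e.g. Adam)
        self.step_count += 1
        if self.step_count % self.k == 0:
            # After k fast steps, take the "1 step back":
            # slow <- slow + alpha * (fast - slow), then fast <- slow.
            with torch.no_grad():
                for group, slow_group in zip(self.optimizer.param_groups,
                                             self.slow_weights):
                    for p, slow in zip(group["params"], slow_group):
                        slow.add_(p.detach() - slow, alpha=self.alpha)
                        p.copy_(slow)
        return loss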


A Chinese-language introduction is available at: https://mp.weixin.qq.com/s/3J-28xd0pyToSy8zzKs1RA
