MagiOPT

A Unified PyTorch Optimizer for Numerical Optimization

Why MagiOPT?

  • Unified framework for both unconstrained and constrained optimization
  • Efficient, powered by PyTorch's automatic differentiation engine
  • User-friendly and easy to modify
  • Visualization of both surface (or curve) and contour plots

Installation

$ git clone ...
$ cd MagiOPT

How to use

  • Unconstrained (a concrete sketch follows this list)

import MagiOPT as optim

def func(x):
    ...                        # objective written with torch operations

x0 = ...                       # initial point: list, numpy array, or torch tensor
optimizer = optim.SD(func)     # Steepest Descent
x = optimizer.step(x0)         # run the optimization on the fly
optimizer.plot()               # visualize the surface/contour and the iterates
  • Constrained

import MagiOPT as optim

def objective(x):
    ...

def constr1(x):
    ...

def constr2(x):
    ...

# ... further constraints as needed

sigma = ...                                 # penalty parameter
x0 = ...                                    # initial point

optimizer = optim.Penalty(objective,
                          sigma,
                          (constr1, '<='),
                          (constr2, '>='),
                          plot=True)        # Penalty method
optimizer.BFGS()                            # choose the inner (unconstrained) optimizer
x = optimizer.step(x0)                      # run the optimization on the fly
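
To make the unconstrained usage concrete, here is a minimal sketch that minimizes the 2D Rosenbrock function with steepest descent. It assumes only the optim.SD / step / plot API shown above; the function rosenbrock and the starting point are illustrative choices, not part of the library.

import MagiOPT as optim

# Minimal sketch: minimize f(x) = (1 - x1)^2 + 100 (x2 - x1^2)^2,
# whose minimizer is (1, 1). Written with torch-compatible operations.
def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

optimizer = optim.SD(rosenbrock)   # Steepest Descent on the objective
x = optimizer.step([-1.5, 2.0])    # a plain list as the initial point (numpy arrays and torch tensors also work)
optimizer.plot()                   # surface and contour with the iterated sequence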

Supported Optimizers

Unconstrained                 Constrained
Steepest Descent              Penalty Method
Amortized Newton Method       Log-Barrier Method
SR1                           Inverse-Barrier Method
DFP                           Augmented Lagrangian Method
BFGS
Broyden
FR
PRP
CG for Quadratic Function
CG for Linear Equation
BB1
BB2
Gauss-Newton
LMF
Dogleg
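
Only optim.SD and the inner optimizer.BFGS() call appear in the snippets above, so the following is a hypothetical sketch: assuming each unconstrained optimizer exposes the same constructor pattern as optim.SD, switching methods is a one-line change.

import MagiOPT as optim

def func(x):
    ...                             # your objective, written with torch operations

x0 = ...                            # initial point

# Assumed pattern: each unconstrained method is constructed like optim.SD.
optimizer = optim.BFGS(func)        # or optim.DFP(func), optim.FR(func), ...
x = optimizer.step(x0)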

Visualization

For an unconstrained optimizer, a single line

optimizer.plot()

visualizes the iterated sequence: as a 2D curve for a one-dimensional problem, or as a 3D surface together with its contour plot for a two-dimensional one.

For a constrained optimizer, constructing it with

optimizer = optim.Penalty(..., plot=True)

visualizes the function and the iterated sequence of each inner iteration, again as a 3D surface and its contour plot.

Reminder

  • Most algorithms are sensitive to the initial point; choosing it well will save you a lot of effort
  • Because constrained problems can be ill-conditioned, the constrained optimizers may require some trial and error
  • The Barzilai-Borwein methods can be unstable on non-quadratic problems; you can still inspect the path through an intermediate visualization
  • You can extract the iterated sequence easily via (see the sketch after this list)
    optimizer.sequence
  • Your function must be written with torch operations, but your input need not be a tensor: a numpy array, a torch tensor, or even a list all work
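
A minimal sketch of the last two points, assuming the sequence attribute named above; the toy objective and the starting point are illustrative.

import numpy as np
import MagiOPT as optim

# Toy quadratic objective written with torch-compatible operations.
def func(x):
    return (x[0] - 1)**2 + (x[1] + 2)**2

optimizer = optim.SD(func)
x = optimizer.step(np.array([3.0, -5.0]))   # numpy array input; a list or torch tensor works too
print(optimizer.sequence)                   # iterates visited during optimization (attribute from the Reminder above)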

Requirements

  • Python 3.7 or above
  • PyTorch