
update documentation
master committed Jul 30, 2023
1 parent 10a9527 commit a96283a
Showing 2 changed files with 33 additions and 4 deletions.
10 changes: 7 additions & 3 deletions docs/source/methods.rst
@@ -29,12 +29,16 @@ Posterior approximation methods
taken by averaging checkpoints over the stochastic optimization trajectory. The covariance is also estimated
empirically along the trajectory, and it is made of a diagonal component and a low-rank non-diagonal one.

- **Hamiltonian Monte Carlo (HMC)** `[Neal, 2010] <https://arxiv.org/pdf/1206.1901.pdf>`_
HMC approximates the posterior as the steady-state distribution of a Markov chain with Hamiltonian dynamics.
After the initial "burn-in" phase, each step of the chain generates a sample from the posterior. HMC is typically applied
in the full-batch scenario.

- **Stochastic Gradient Hamiltonian Monte Carlo (SGHMC)** `[Chen et al., 2014] <http://proceedings.mlr.press/v32/cheni14.pdf>`_
SGHMC approximates the posterior as the steady-state distribution of a Markov chain with Hamiltonian dynamics.
After the initial "burn-in" phase, each step of the chain generates samples from the posterior.
SGHMC implements a variant of the HMC algorithm that expects noisy gradient estimates computed on mini-batches of data.

- **Cyclical Stochastic Gradient Langevin Dynamics (Cyclical SGLD)** `[Zhang et al., 2020] <https://openreview.net/pdf?id=rkeS1RVtPS>`_
Cyclical SGLD adapts the cyclical cosine step size schedule, and alternates between *exploration* and *sampling* stages to better
Cyclical SGLD adopts the cyclical cosine step size schedule, and alternates between *exploration* and *sampling* stages to better
explore the multimodal posteriors for deep neural networks (a minimal sketch of the cyclical step size schedule follows this list).
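
The cyclical cosine step size schedule referenced in the Cyclical SGLD item is the one proposed by Zhang et al., 2020; below is a minimal NumPy sketch of that schedule. The step count, number of cycles, and initial step size in the example are illustrative assumptions, not Fortuna defaults.

.. code-block:: python

    import numpy as np

    def cyclical_cosine_step_size(step, total_steps, n_cycles, init_step_size):
        """Step size at iteration ``step`` under a cyclical cosine schedule."""
        cycle_length = int(np.ceil(total_steps / n_cycles))
        # Fraction of the current cycle already completed, in [0, 1).
        position = (step % cycle_length) / cycle_length
        # Decays from ``init_step_size`` towards zero within each cycle, then restarts.
        return init_step_size / 2 * (np.cos(np.pi * position) + 1)

    # Example: 10,000 steps split into 4 cycles, starting from a step size of 1e-5.
    schedule = [cyclical_cosine_step_size(k, 10_000, 4, 1e-5) for k in range(10_000)]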

Parametric calibration methods
27 changes: 26 additions & 1 deletion docs/source/references/prob_model/posterior/sgmcmc.rst
@@ -4,12 +4,37 @@ SG-MCMC procedures approximate the posterior as a steady-state distribution of
a Markov chain that utilizes noisy estimates of the gradient
computed on mini-batches of data.
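
The noisy gradient here is an unbiased minibatch estimate of the gradient of the log-posterior. The following is a minimal JAX sketch of how such an estimate can be formed; ``log_lik_fn``, ``log_prior_fn``, and ``dataset_size`` are illustrative placeholders, not part of Fortuna's API.

.. code-block:: python

    from jax import grad

    def noisy_grad_log_posterior(params, inputs, targets, log_lik_fn, log_prior_fn, dataset_size):
        """Unbiased minibatch estimate of the gradient of the log-posterior."""
        def log_posterior(p):
            # Rescale the minibatch log-likelihood so its expectation matches the full-data term.
            scale = dataset_size / inputs.shape[0]
            return scale * log_lik_fn(p, inputs, targets) + log_prior_fn(p)
        return grad(log_posterior)(params)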

Hamiltonian Monte Carlo (HMC)
=============================

HMC `[Neal, 2010] <https://arxiv.org/pdf/1206.1901.pdf>`_ is an MCMC sampling
algorithm that simulates a Hamiltonian dynamical system to rapidly explore
the posterior.
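
As a rough illustration of the dynamics being simulated, here is a self-contained single-step HMC sketch with a leapfrog integrator and a Metropolis correction. It is not Fortuna's implementation, and all names in it are illustrative.

.. code-block:: python

    import jax.numpy as jnp
    from jax import grad, random

    def hmc_step(key, position, log_posterior_fn, step_size, n_leapfrog):
        """One HMC proposal via leapfrog integration, with a Metropolis correction."""
        grad_fn = grad(log_posterior_fn)
        key_momentum, key_accept = random.split(key)
        momentum = random.normal(key_momentum, position.shape)

        def hamiltonian(q, p):
            # Potential energy is the negative log-posterior; kinetic energy is Gaussian.
            return -log_posterior_fn(q) + 0.5 * jnp.sum(p ** 2)

        q, p = position, momentum
        # Leapfrog integration of the Hamiltonian dynamics.
        p = p + 0.5 * step_size * grad_fn(q)
        for _ in range(n_leapfrog - 1):
            q = q + step_size * p
            p = p + step_size * grad_fn(q)
        q = q + step_size * p
        p = p + 0.5 * step_size * grad_fn(q)

        # Accept or reject to correct for discretization error.
        log_accept_ratio = hamiltonian(position, momentum) - hamiltonian(q, p)
        accept = jnp.log(random.uniform(key_accept)) < log_accept_ratio
        return jnp.where(accept, q, position)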

.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_approximator.HMCPosteriorApproximator

.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_posterior.HMCPosterior
:show-inheritance:
:no-inherited-members:
:exclude-members: state
:members: fit, sample, load_state, save_state

.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_state.HMCState
:show-inheritance:
:no-inherited-members:
:inherited-members: init, init_from_dict
:members: convert_from_map_state
:exclude-members: params, mutable, calib_params, calib_mutable, replace, apply_gradients, encoded_name, create
:no-undoc-members:
:no-special-members:


Stochastic Gradient Hamiltonian Monte Carlo (SGHMC)
===================================================

SGHMC `[Chen T. et al., 2014] <http://proceedings.mlr.press/v32/cheni14.pdf>`_
is a popular MCMC algorithm that uses stochastic gradient estimates to scale
to large datasets.
HMC to large datasets.
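
For intuition, the following is a minimal sketch of one SGHMC update with friction, following the discretization in Chen et al., 2014, with the gradient-noise estimate set to zero. It is independent of Fortuna's ``SGHMCPosteriorApproximator``, and the argument names are illustrative.

.. code-block:: python

    import jax.numpy as jnp
    from jax import random

    def sghmc_update(key, params, momentum, stochastic_grad, step_size, friction):
        """One SGHMC step on the negative log-posterior (Chen et al., 2014)."""
        # Injected noise with variance 2 * friction * step_size
        # (the gradient-noise estimate from the paper is taken to be zero here).
        noise = jnp.sqrt(2.0 * friction * step_size) * random.normal(key, params.shape)
        # ``stochastic_grad`` is a noisy minibatch gradient of the negative log-posterior.
        new_momentum = momentum - step_size * stochastic_grad - friction * momentum + noise
        new_params = params + new_momentum
        return new_params, new_momentum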

.. autoclass:: fortuna.prob_model.posterior.sgmcmc.sghmc.sghmc_approximator.SGHMCPosteriorApproximator

