This repository was archived by the owner on Apr 23, 2025. It is now read-only.

Commit a96283a: update documentation

1 parent 10a9527

File tree

2 files changed: +33 −4 lines


docs/source/methods.rst

Lines changed: 7 additions & 3 deletions

@@ -29,12 +29,16 @@ Posterior approximation methods
   taken by averaging checkpoints over the stochastic optimization trajectory. The covariance is also estimated
   empirically along the trajectory, and it is made of a diagonal component and a low-rank non-diagonal one.
 
+- **Hamiltonian Monte Carlo (HMC)** `[Neal, 2010] <https://arxiv.org/pdf/1206.1901.pdf>`_
+  HMC approximates the posterior as a steady-state distribution of a Monte Carlo Markov chain with Hamiltonian dynamics.
+  After the initial "burn-in" phase, each step of the chain generates a sample from the posterior. HMC is typically applied
+  in the full-batch scenario.
+
 - **Stochastic Gradient Hamiltonian Monte Carlo (SGHMC)** `[Chen et al., 2014] <http://proceedings.mlr.press/v32/cheni14.pdf>`_
-  SGHMC approximates the posterior as a steady-state distribution of a Monte Carlo Markov chain with Hamiltonian dynamics.
-  After the initial "burn-in" phase, each step of the chain generates samples from the posterior.
+  SGHMC implements a variant of the HMC algorithm that expects noisy gradient estimates computed on mini-batches of data.
 
 - **Cyclical Stochastic Gradient Langevin Dynamics (Cyclical SGLD)** `[Zhang et al., 2020] <https://openreview.net/pdf?id=rkeS1RVtPS>`_
-  Cyclical SGLD adapts the cyclical cosine step size schedule, and alternates between *exploration* and *sampling* stages to better
+  Cyclical SGLD adopts the cyclical cosine step size schedule, and alternates between *exploration* and *sampling* stages to better
   explore the multimodal posteriors for deep neural networks.
 
 Parametric calibration methods
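The HMC description added in this diff (simulate Hamiltonian dynamics, discard a burn-in phase, then treat each step as a posterior sample) can be illustrated with a minimal, self-contained sketch. This is plain NumPy on a 1-D standard normal target, not Fortuna's API; all names are illustrative.

```python
import numpy as np

def hmc_sample(log_prob, log_prob_grad, n_samples, step_size=0.1, n_leapfrog=20, seed=0):
    """Toy HMC sampler: leapfrog integration plus a Metropolis correction."""
    rng = np.random.default_rng(seed)
    q = 0.0
    samples = []
    for _ in range(n_samples):
        p = rng.normal()  # resample auxiliary momentum each iteration
        q_new, p_new = q, p
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * log_prob_grad(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new
            p_new += step_size * log_prob_grad(q_new)
        q_new += step_size * p_new
        p_new += 0.5 * step_size * log_prob_grad(q_new)
        # Metropolis accept/reject corrects the discretization error
        log_accept = (log_prob(q_new) - 0.5 * p_new**2) - (log_prob(q) - 0.5 * p**2)
        if np.log(rng.uniform()) < log_accept:
            q = q_new
        samples.append(q)
    return np.array(samples)

# Standard normal target: log p(q) = -q^2 / 2 (up to a constant)
draws = hmc_sample(lambda q: -0.5 * q**2, lambda q: -q, n_samples=2000)
burned = draws[500:]  # discard the initial "burn-in" phase
```

Note that every gradient here is computed on the full target, which is why HMC, as the diff says, is typically a full-batch method.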

docs/source/references/prob_model/posterior/sgmcmc.rst

Lines changed: 26 additions & 1 deletion

@@ -4,12 +4,37 @@ SG-MCMC procedures approximate the posterior as a steady-state distribution of
 a Monte Carlo Markov chain, that utilizes noisy estimates of the gradient
 computed on minibatches of data.
 
+Hamiltonian Monte Carlo (HMC)
+=============================
+
+HMC `[Neal, 2010] <https://arxiv.org/pdf/1206.1901.pdf>`_ is an MCMC sampling
+algorithm that simulates a Hamiltonian dynamical system to rapidly explore
+the posterior.
+
+.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_approximator.HMCPosteriorApproximator
+
+.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_posterior.HMCPosterior
+   :show-inheritance:
+   :no-inherited-members:
+   :exclude-members: state
+   :members: fit, sample, load_state, save_state
+
+.. autoclass:: fortuna.prob_model.posterior.sgmcmc.hmc.hmc_state.HMCState
+   :show-inheritance:
+   :no-inherited-members:
+   :inherited-members: init, init_from_dict
+   :members: convert_from_map_state
+   :exclude-members: params, mutable, calib_params, calib_mutable, replace, apply_gradients, encoded_name, create
+   :no-undoc-members:
+   :no-special-members:
+
 Stochastic Gradient Hamiltonian Monte Carlo (SGHMC)
 ===================================================
 
 SGHMC `[Chen T. et al., 2014] <http://proceedings.mlr.press/v32/cheni14.pdf>`_
 is a popular MCMC algorithm that uses stochastic gradient estimates to scale
-to large datasets.
+HMC to large datasets.
 
 .. autoclass:: fortuna.prob_model.posterior.sgmcmc.sghmc.sghmc_approximator.SGHMCPosteriorApproximator
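The SGHMC variant documented above replaces full-batch gradients with noisy minibatch estimates and adds a friction term that dissipates the extra gradient noise (Chen et al., 2014). A minimal sketch on a toy 1-D Gaussian posterior, in plain NumPy rather than Fortuna's API (all names illustrative); note that the uncorrected minibatch noise somewhat broadens the stationary distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)  # observations with unknown mean
N = len(data)

def stoch_grad_log_post(theta, batch):
    # Minibatch estimate of the log-posterior gradient
    # (flat prior, unit-variance Gaussian likelihood)
    return (N / len(batch)) * np.sum(batch - theta)

eta = 1e-4    # step size
alpha = 0.1   # friction term (step size times friction coefficient)
theta, v = 0.0, 0.0
samples = []
for _ in range(5000):
    batch = rng.choice(data, size=32, replace=False)
    # SGHMC momentum update: noisy gradient, friction, and injected noise
    v = (1 - alpha) * v + eta * stoch_grad_log_post(theta, batch) \
        + np.sqrt(2 * alpha * eta) * rng.normal()
    theta += v
    samples.append(theta)
posterior_draws = np.array(samples[1000:])  # discard burn-in
```

With a flat prior and known unit variance, the exact posterior over the mean is centered at the sample average of the data (close to 2.0 here), so the chain should settle around that value.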
