Commit: Adapt / Bump to ManifoldsBase 1.0 (#437)
* adapt version.
* A bit of work on typos.
* Update metadata.
* Fix tests.
* Runs formatter.
* Make sure all docs and tutorials run on the new versions.
* Rename optimize! to getstarted and bump version.
* Fix make.jl links and others in the docs that can be md links by now.
kellertuer authored Feb 10, 2025
1 parent b0e5ac0 commit c0f89f3
Showing 40 changed files with 159 additions and 120 deletions.
2 changes: 1 addition & 1 deletion .gitignore
Original file line number Diff line number Diff line change
@@ -18,7 +18,7 @@ tutorials/img
docs/src/tutorials/*/
docs/src/tutorials/*.md
docs/.CondaPkg
docs/src/tutorials/Optimize!_files
docs/src/tutorials/getstarted_files
docs/src/tutorials/*.html
docs/src/changelog.md
docs/styles/Google
16 changes: 11 additions & 5 deletions .vale.ini
@@ -7,12 +7,16 @@ Packages = Google
[formats]
# code blocks with Julia in Markdown do not yet work well
qmd = md
jl = md

[docs/src/*.md]
BasedOnStyles = Vale, Google

[docs/src/contributing.md]
BasedOnStyles =
BasedOnStyles = Vale, Google
Google.Will = false ; in the given format we really do intend a _will_
Google.Headings = false ; some headings might really have [] in them
Google.FirstPerson = false ; we pose a few contribution points as first-person questions

[Changelog.md, CONTRIBUTING.md]
BasedOnStyles = Vale, Google
@@ -39,12 +43,14 @@ TokenIgnores = \$(.+)\$,\[.+?\]\(@(ref|id|cite).+?\),`.+`,``.*``,\s{4}.+\n
Google.Units = false # to ignore formats= for now.
TokenIgnores = \$(.+)\$,\[.+?\]\(@(ref|id|cite).+?\),`.+`,``.*``,\s{4}.+\n

[tutorials/*.md] ; actually .qmd for the first, second autogenerated
[tutorials/*.qmd] ; actually .qmd for the first, second autogenerated
BasedOnStyles = Vale, Google
; ignore (1) math (2) ref and cite keys (3) code in docs (4) math in docs (5,6) indented blocks
TokenIgnores = (\$+[^\n$]+\$+)
Google.We = false # For tutorials we want to address the user directly.

[docs/src/tutorials/*.md]
; ignore since they are derived files
BasedOnStyles =
[docs/src/tutorials/*.md] ; actually .qmd for the first, second autogenerated
BasedOnStyles = Vale, Google
; ignore (1) math (2) ref and cite keys (3) code in docs (4) math in docs (5,6) indented blocks
TokenIgnores = (\$+[^\n$]+\$+)
Google.We = false # For tutorials we want to address the user directly.
5 changes: 5 additions & 0 deletions .zenodo.json
@@ -25,6 +25,11 @@
"name": "Riemer, Tom-Christian",
"type": "ProjectMember"
},
{
"affiliation": "NTNU Trondheim",
"name": "Oddsen, Sander Engen",
"type": "ProjectMember"
},
{
"name": "Schilly, Harald",
"type": "Other"
10 changes: 8 additions & 2 deletions Changelog.md
@@ -5,7 +5,13 @@ All notable Changes to the Julia package `Manopt.jl` will be documented in this
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.5.5] Januaey 4, 2025
## [0.5.6] February 10, 2025

### Changed

* bump dependencies of all JuliaManifolds ecosystem packages to be consistent with ManifoldsBase 1.0

## [0.5.5] January 4, 2025

### Added

@@ -122,7 +128,7 @@ In general we introduce a few factories, that avoid having to pass the manifold
* the previous `stabilize=true` is now set with `(project!)=embed_project!` in general,
and if the manifold is represented by points in the embedding, like the sphere, `(project!)=project!` suffices
* the new default is `(project!)=copyto!`, so by default no projection/stabilization is performed.
* the positional argument `p` (usually the last or the third to last if subsolvers existed) has been moved to a keyword argument `p=` in all State constructors
* the positional argument `p` (usually the last or the third to last if sub solvers existed) has been moved to a keyword argument `p=` in all State constructors
* in `NelderMeadState` the `population` moved from positional to keyword argument as well,
* the way to initialise sub solvers in the solver states has been unified. In the new variant
* the `sub_problem` is always a positional argument; namely the last one
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
authors = ["Ronny Bergmann <[email protected]>"]
version = "0.5.5"
version = "0.5.6"

[deps]
ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
@@ -51,7 +51,7 @@ LineSearches = "7.2.0"
LinearAlgebra = "1.10"
ManifoldDiff = "0.3.8, 0.4"
Manifolds = "0.9.11, 0.10"
ManifoldsBase = "0.15.18"
ManifoldsBase = "0.15.18, 1.0"
ManoptExamples = "0.1.10"
Markdown = "1.10"
Plots = "1.30"
2 changes: 1 addition & 1 deletion Readme.md
@@ -36,7 +36,7 @@ In Julia you can get started by just typing
using Pkg; Pkg.add("Manopt");
```

and then checkout the [Get started: optimize!](https://manoptjl.org/stable/tutorials/Optimize/) tutorial.
and then checkout the [🏔️ Get started with Manopt.jl](https://manoptjl.org/stable/tutorials/getstarted/) tutorial.
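In the spirit of that tutorial, a minimal usage sketch (computing a Riemannian mean of a few points on a sphere; the function names follow the `Manopt.jl`/`Manifolds.jl` API, the concrete numbers are illustrative):

``` julia
using Manopt, Manifolds, ManifoldDiff, Random

Random.seed!(42)
M = Sphere(2)                   # the unit sphere S^2 embedded in R^3
data = [rand(M) for _ in 1:10]  # a few random points on M

# cost: half the mean squared distance to the data points
f(M, p) = sum(distance(M, p, q)^2 for q in data) / (2 * length(data))
# its Riemannian gradient, built from grad_distance (gradient of 0.5 d(q, ·)^2)
grad_f(M, p) = sum(ManifoldDiff.grad_distance(M, q, p) for q in data) / length(data)

m = gradient_descent(M, f, grad_f, first(data))
is_point(M, m)                  # the result stays on the manifold
```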

## Related packages

6 changes: 3 additions & 3 deletions docs/Project.toml
@@ -40,9 +40,9 @@ JuMP = "1"
LRUCache = "1"
LineSearches = "7"
Literate = "2"
Manifolds = "0.8.81, 0.9, 0.10"
ManifoldsBase = "0.14.12, 0.15"
Manopt = "0.4, 0.5"
Manifolds = "0.10"
ManifoldsBase = "1"
Manopt = "0.5"
Plots = "1"
QuadraticModels = "0.9.6"
RecursiveArrayTools = "2, 3"
9 changes: 5 additions & 4 deletions docs/make.jl
@@ -21,7 +21,7 @@ Arguments
Then you can spare time in the rendering by not passing this argument.
If quarto is not run, some tutorials are generated as empty files, since they
are referenced from within the documentation.
These are currently `Optimize.md` and `ImplementOwnManifold.md`.
These are currently `getstarted.md` and `ImplementOwnManifold.md`.
"""
)
exit(0)
@@ -35,7 +35,7 @@ tutorials_in_menu = !("--exclude-tutorials" ∈ ARGS)
# (a) setup the tutorials menu – check whether all files exist
tutorials_menu =
"How to..." => [
"🏔️ Get started: optimize." => "tutorials/Optimize.md",
"🏔️ Get started with Manopt.jl" => "tutorials/getstarted.md",
"Speedup using in-place computations" => "tutorials/InplaceGradient.md",
"Use automatic differentiation" => "tutorials/AutomaticDifferentiation.md",
"Define objectives in the embedding" => "tutorials/EmbeddingObjectives.md",
@@ -54,8 +54,9 @@ for (name, file) in tutorials_menu.second
global all_tutorials_exist = false
if !run_quarto
@warn "Tutorial $name does not exist at $fn."
if (!isfile(fn)) &&
(endswith(file, "Optimize.md") || endswith(file, "ImplementOwnManifold.md"))
if (!isfile(fn)) && (
endswith(file, "getstarted.md") || endswith(file, "ImplementOwnManifold.md")
)
@warn "Generating empty file, since this tutorial is linked to from the documentation."
touch(fn)
end
4 changes: 2 additions & 2 deletions docs/src/about.md
@@ -28,8 +28,8 @@ to clone/fork the repository or open an issue.
* [ExponentialFamilyProjection.jl](https://github.com/ReactiveBayes/ExponentialFamilyProjection.jl) package uses `Manopt.jl` to project arbitrary functions onto the closest exponential family distributions. The package also integrates with [`RxInfer.jl`](https://github.com/ReactiveBayes/RxInfer.jl) to enable Bayesian inference in a larger set of probabilistic models.
* [Caesar.jl](https://github.com/JuliaRobotics/Caesar.jl) within non-Gaussian factor graph inference algorithms

Is a package missing? [Open an issue](https://github.com/JuliaManifolds/Manopt.jl/issues/new)!
It would be great to collect anything and anyone using Manopt.jl
If you are missing a package, that uses `Manopt.jl`, please [open an issue](https://github.com/JuliaManifolds/Manopt.jl/issues/new).
It would be great to collect anything and anyone using Manopt.jl in this list.

## Further packages

4 changes: 2 additions & 2 deletions docs/src/index.md
@@ -20,7 +20,7 @@ or in other words: find the point ``p`` on the manifold, where ``f`` reaches its
It belongs to the “Manopt family”, which includes [Manopt](https://manopt.org) (Matlab) and [pymanopt.org](https://www.pymanopt.org/) (Python).

If you want to delve right into `Manopt.jl` read the
[🏔️ Get started: optimize.](tutorials/Optimize.md) tutorial.
[🏔️ Get started with Manopt.jl](tutorials/getstarted.md) tutorial.

`Manopt.jl` makes it easy to use an algorithm for your favourite
manifold as well as a manifold for your favourite algorithm. It already provides
@@ -94,7 +94,7 @@ The notation in the documentation aims to follow the same [notation](https://jul
### Visualization

To visualize and interpret results, `Manopt.jl` aims to provide both easy plot functions as well as [exports](helpers/exports.md). Furthermore a system to get [debug](plans/debug.md) during the iterations of an algorithms as well as [record](plans/record.md) capabilities, for example to record a specified tuple of values per iteration, most prominently [`RecordCost`](@ref) and
[`RecordIterate`](@ref). Take a look at the [🏔️ Get started: optimize.](tutorials/Optimize.md) tutorial on how to easily activate this.
[`RecordIterate`](@ref). Take a look at the [🏔️ Get started with Manopt.jl](tutorials/getstarted.md) tutorial on how to easily activate this.

## Literature

4 changes: 2 additions & 2 deletions docs/src/solvers/index.md
@@ -98,7 +98,7 @@ For these you can use
* [Steihaug-Toint Truncated Conjugate-Gradient Method](truncated_conjugate_gradient_descent.md) a solver for a constrained problem defined on a tangent space.


## Alphabetical list List of algorithms
## Alphabetical list of algorithms

| Solver | Function | State |
|:---------|:----------------|:---------|
@@ -202,7 +202,7 @@ also use the third (lowest level) and just call
solve!(problem, state)
```

### Closed-form subsolvers
### Closed-form sub solvers

If a subsolver solution is available in closed form, `ClosedFormSubSolverState` is used to indicate that.

67 changes: 35 additions & 32 deletions docs/src/tutorials/InplaceGradient.md
@@ -5,7 +5,7 @@ When it comes to time critical operations, a main ingredient in Julia is given b
mutating functions, that is those that compute in place without additional memory
allocations. In the following, we illustrate how to do this with `Manopt.jl`.

Let’s start with the same function as in [Get started: optimize!](https://manoptjl.org/stable/tutorials/Optimize!.html)
Let’s start with the same function as in [🏔️ Get started with Manopt.jl](https://manoptjl.org/stable/tutorials/getstarted.html)
and compute the mean of some points, only that here we use the sphere $\mathbb S^{30}$
and $n=800$ points.

@@ -32,7 +32,7 @@ p[2] = 1.0
data = [exp(M, p, σ * rand(M; vector_at=p)) for i in 1:n];
```

## Classical Definition
## Classical definition

The variant from the previous tutorial defines a cost $f(x)$ and its gradient $\operatorname{grad}f(p)$
"""
@@ -58,26 +58,26 @@ We can also benchmark this as
@benchmark gradient_descent($M, $f, $grad_f, $p0; stopping_criterion=$sc)
```

BenchmarkTools.Trial: 106 samples with 1 evaluation.
Range (min … max): 46.774 ms … 50.326 ms ┊ GC (min … max): 2.31% … 2.47%
Time (median): 47.207 ms ┊ GC (median): 2.45%
Time (mean ± σ): 47.364 ms ± 608.514 μs ┊ GC (mean ± σ): 2.53% ± 0.25%
BenchmarkTools.Trial: 89 samples with 1 evaluation per sample.
Range (min … max): 52.976 ms … 104.222 ms ┊ GC (min … max): 8.05% … 5.55%
Time (median): 55.145 ms ┊ GC (median): 9.99%
Time (mean ± σ): 56.391 ms ± 6.102 ms ┊ GC (mean ± σ): 9.92% ± 1.43%

▄▇▅▇█▄▇
▇▆████████▇▇▅▅▃▁▆▁▁▁▅▁▁▅▁▃▃▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▅ ▃
46.8 ms Histogram: frequency by time 50.2 ms <
▅██▅▃▁
▅███████▁▅▇▅▁▅▁▁▅▅▁▁▁▅▅▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▅ ▁
53 ms Histogram: log(frequency) by time 81.7 ms <

Memory estimate: 182.50 MiB, allocs estimate: 615822.
Memory estimate: 173.54 MiB, allocs estimate: 1167348.

## In-place Computation of the Gradient
## In-place computation of the gradient

We can reduce the memory allocations by implementing the gradient to be evaluated in-place.
We do this by using a [functor](https://docs.julialang.org/en/v1/manual/methods/#Function-like-objects).
The motivation is twofold: on one hand, we want to avoid variables from the global scope,
for example the manifold `M` or the `data`, being used within the function.
Doing the same for more complicated cost functions might also be worth pursuing.
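The in-place functor this tutorial section builds can be sketched as follows; this is a reconstruction from the published tutorial, with `grad_distance!` being the in-place gradient of the (halved, squared) distance from `ManifoldDiff.jl`:

``` julia
using Manifolds, ManifoldsBase
using ManifoldDiff: grad_distance!

struct GradF!{TD,TV}
    data::TD # reference to the data points, no copy is made
    tmp::TV  # preallocated tangent vector used as scratch memory
end
function (grad_f!::GradF!)(M, X, p)
    fill!(X, 0)
    for q in grad_f!.data
        # write grad of 0.5 * distance(q, ·)^2 at p into tmp, accumulate into X
        grad_distance!(M, grad_f!.tmp, q, p)
        X .+= grad_f!.tmp
    end
    X ./= length(grad_f!.data)
    return X
end
```

An instance such as `GradF!(data, similar(data[1]))` can then be passed to the solver with `evaluation=InplaceEvaluation()`, as in the benchmark call shown next.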

Here, we store the data (as reference) and one introduce temporary memory in order to avoid
Here, we store the data (as reference) and one introduce temporary memory to avoid
reallocation of memory per `grad_distance` computation. We get

``` julia
@@ -116,16 +116,16 @@ We can again benchmark this
) setup = (m2 = deepcopy($p0))
```

BenchmarkTools.Trial: 176 samples with 1 evaluation.
Range (min … max): 27.358 ms … 84.206 ms ┊ GC (min … max): 0.00% … 0.00%
Time (median): 27.768 ms ┊ GC (median): 0.00%
Time (mean ± σ): 28.504 ms ± 4.338 ms ┊ GC (mean ± σ): 0.60% ± 1.96%
BenchmarkTools.Trial: 130 samples with 1 evaluation per sample.
Range (min … max): 36.646 ms … 64.781 ms ┊ GC (min … max): 0.00% … 0.00%
Time (median): 37.559 ms ┊ GC (median): 0.00%
Time (mean ± σ): 38.658 ms ± 3.904 ms ┊ GC (mean ± σ): 0.73% ± 2.68%

▂█▇▂ ▂
▆▇████▆█▆▆▄▄▃▄▄▃▃▃▁▃▃▃▃▃▃▃▃▃▄▃▃▃▃▃▃▁▃▁▁▃▁▁▁▁▁▁▃▃▁▁▃▃▁▁▁▁▃▃▃ ▃
27.4 ms Histogram: frequency by time 31.4 ms <
██▅▅▄▂▁ ▂
███████▁██▁▅▁▁▁▅▁▁▁▁▅▅▅▁▁▁▅▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▅▁▁▁▁▁▁▁▁▁▅ ▅
36.6 ms Histogram: log(frequency) by time 61 ms <

Memory estimate: 3.83 MiB, allocs estimate: 5797.
Memory estimate: 3.59 MiB, allocs estimate: 6863.

which is faster by about a factor of 2 compared to the first solver-call.
Note that the results `m1` and `m2` are of course the same.
@@ -134,7 +134,7 @@
distance(M, m1, m2)
```

2.4669338186126805e-17
4.8317610992693745e-11

## Technical details

@@ -146,21 +146,24 @@ Pkg.status()
```

Status `~/Repositories/Julia/Manopt.jl/tutorials/Project.toml`
[6e4b80f9] BenchmarkTools v1.5.0
[5ae59095] Colors v0.12.11
[31c24e10] Distributions v0.25.108
[26cc04aa] FiniteDifferences v0.12.31
[7073ff75] IJulia v1.24.2
[47edcb42] ADTypes v1.13.0
[6e4b80f9] BenchmarkTools v1.6.0
⌃ [5ae59095] Colors v0.12.11
[31c24e10] Distributions v0.25.117
[26cc04aa] FiniteDifferences v0.12.32
[7073ff75] IJulia v1.26.0
[8ac3fa9e] LRUCache v1.6.1
[af67fdf4] ManifoldDiff v0.3.10
[1cead3c2] Manifolds v0.9.18
[3362f125] ManifoldsBase v0.15.10
[0fc0a36d] Manopt v0.4.63 `..`
[91a5bcdd] Plots v1.40.4
[af67fdf4] ManifoldDiff v0.4.2
[1cead3c2] Manifolds v0.10.13
[3362f125] ManifoldsBase v1.0.1
[0fc0a36d] Manopt v0.5.5 `..`
[91a5bcdd] Plots v1.40.9
[731186ca] RecursiveArrayTools v3.29.0
Info Packages marked with ⌃ have new versions available and may be upgradable.

``` julia
using Dates
now()
```

2024-05-26T13:52:05.613
2025-02-10T13:22:51.002
5 changes: 5 additions & 0 deletions docs/styles/config/vocabularies/Manopt/accept.txt
@@ -23,6 +23,7 @@ Cartis
canonicalization
canonicalized
Constantin
[Cc]ubics
Dai
deactivatable
Diepeveen
@@ -76,9 +77,11 @@ Munkvold
[Mm]ead
[Nn]elder
Nesterov
Nesterovs
Newton
nonmonotone
nonpositive
[Nn]onsmooth
[Pp]arametrising
Parametrising
[Pp]ock
@@ -110,9 +113,11 @@ Stephansen
Stokkenes
[Ss]ubdifferential
[Ss]ubgradient
[Ss]ubgradients
subsampled
[Ss]ubsolver
summand
summands
superlinear
supertype
th
2 changes: 1 addition & 1 deletion ext/ManoptLineSearchesExt.jl
@@ -39,7 +39,7 @@ function (cs::Manopt.LineSearchesStepsize)(
end
function ϕdϕ(α)
# TODO: optimize?
retract!(M, p_tmp, p, η, α, cs.retraction_method)
ManifoldsBase.retract_fused!(M, p_tmp, p, η, α, cs.retraction_method)
get_gradient!(mp, X_tmp, p_tmp)
vector_transport_to!(M, Y_tmp, p, η, p_tmp, cs.vector_transport_method)
phi = f(M, p_tmp)
1 change: 1 addition & 0 deletions src/Manopt.jl
@@ -53,6 +53,7 @@ using ManifoldDiff:
riemannian_gradient!,
riemannian_Hessian,
riemannian_Hessian!
using ManifoldsBase
using ManifoldsBase:
AbstractBasis,
AbstractDecoratorManifold,
4 changes: 2 additions & 2 deletions src/helpers/checks.jl
@@ -50,7 +50,7 @@ function check_differential(
#
T = exp10.(log_range)
# points `p_i` to evaluate the error function at
points = map(t -> retract(M, p, Xn, t, retraction_method), T)
points = map(t -> ManifoldsBase.retract_fused(M, p, Xn, t, retraction_method), T)
costs = [F(M, pi) for pi in points]
# linearized
linearized = map(t -> F(M, p) + t * dF(M, p, Xn), T)
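These hunks replace the fused retraction call: in ManifoldsBase 1.0 the variant of `retract` that takes a scaling factor `t` was split off into `retract_fused`. Conceptually, the two spellings agree (a sketch; the concrete point and vector are illustrative):

``` julia
using Manifolds, ManifoldsBase

M = Sphere(2)
p = [1.0, 0.0, 0.0]   # a point on the sphere
X = [0.0, 0.5, 0.0]   # a tangent vector at p
t = 0.25

# fused variant: the scaling by t is folded into the retraction call
q1 = ManifoldsBase.retract_fused(M, p, X, t, ExponentialRetraction())
# equivalent unfused call with an explicitly scaled tangent vector
q2 = retract(M, p, t .* X, ExponentialRetraction())
# q1 and q2 coincide
```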
@@ -297,7 +297,7 @@ function check_Hessian(
#
T = exp10.(log_range)
# points `p_i` to evaluate error function at
points = map(t -> retract(M, p, X_n, t, retraction_method), T)
points = map(t -> ManifoldsBase.retract_fused(M, p, X_n, t, retraction_method), T)
# corresponding costs
costs = [f(M, pi) for pi in points]
# linearized

2 comments on commit c0f89f3

@kellertuer (Member Author)

@JuliaRegistrator register

Release notes:

Changed

  • bump dependencies of all JuliaManifolds ecosystem packages to be consistent with ManifoldsBase 1.0

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/124715

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

```
git tag -a v0.5.6 -m "<description of version>" c0f89f36606ada59aac2f36d9daa837943c08d03
git push origin v0.5.6
```
