Stepsize and line search

Most iterative algorithms determine a direction along which to proceed together with a step size to find the next iterate. How sophisticated the step size computation can be depends, among other things, on the properties the corresponding problem provides.

Within Manopt.jl, step size determination is implemented as a functor, a subtype of Stepsize, based on

Manopt.StepsizeType
Stepsize

An abstract type for the functors representing step sizes. These are callable structures. The naming scheme is TypeOfStepSize, for example ConstantStepsize.

Every Stepsize has to provide a constructor, and its functor has to implement the interface (p,o,i), where an AbstractManoptProblem, an AbstractManoptSolverState, and the current iteration number are the arguments; it returns a number, namely the step size to use.

For most step sizes it is advisable to employ a ManifoldDefaultsFactory. The function creating the factory should then either be called TypeOf or, if that is confusing or too generic, TypeOfLength.

See also

Linesearch

source

Usually, a constructor should take the manifold M as its first argument, for consistency, to allow general step size functors to be set up based on default values that might depend on the manifold currently under consideration.
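
For illustration, a minimal custom step size following this interface could look as follows. This is a sketch; the type InverseIterationStepsize and its length field are made up for this example.

using Manifolds, Manopt

# A hypothetical step size that returns length/k in iteration k.
struct InverseIterationStepsize <: Manopt.Stepsize
    length::Float64
end
# Take the manifold as the first constructor argument, as recommended above.
InverseIterationStepsize(M::AbstractManifold; length=1.0) = InverseIterationStepsize(length)
# The functor implements the (problem, state, iteration) interface and returns a number.
function (s::InverseIterationStepsize)(
    ::AbstractManoptProblem, ::AbstractManoptSolverState, k, args...; kwargs...
)
    return s.length / max(k, 1)
end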

Currently, the following step sizes are available

Manopt.AdaptiveWNGradientFunction
AdaptiveWNGradient(; kwargs...)
AdaptiveWNGradient(M::AbstractManifold; kwargs...)

A stepsize based on the adaptive gradient method introduced by [GS23].

Given a positive threshold $\hat{c} ∈ ℕ$, a minimal bound $b_{\text{min}} > 0$, an initial $b_0 ≥ b_{\text{min}}$, and a gradient reduction factor threshold $α ∈ [0,1)$.

Set $c_0=0$ and use $ω_0 = \lVert \operatorname{grad} f(p_0) \rVert_{p_0}$.

For the first iterate use the initial step size $s_0 = \frac{1}{b_0}$.

Then, given the last gradient $X_{k-1} = \operatorname{grad} f(p_{k-1})$ and the previous $ω_{k-1}$, the values $(b_k, ω_k, c_k)$ are computed using $X_k = \operatorname{grad} f(p_k)$ and the following cases:

If $\lVert X_k \rVert_{p_k} ≤ αω_{k-1}$, then let $\hat{b}_{k-1} ∈ [b_{\text{min}},b_{k-1}]$ and set

\[(b_k, ω_k, c_k) = \begin{cases} \bigl(\hat{b}_{k-1}, \lVert X_k \rVert_{p_k}, 0 \bigr) & \text{ if } c_{k-1}+1 = \hat{c}\\ \bigl( b_{k-1} + \frac{\lVert X_k \rVert_{p_k}^2}{b_{k-1}}, ω_{k-1}, c_{k-1}+1 \bigr) & \text{ if } c_{k-1}+1<\hat{c} \end{cases}\]

If $\lVert X_k \rVert_{p_k} > αω_{k-1}$, then set

\[(b_k, ω_k, c_k) = \Bigl( b_{k-1} + \frac{\lVert X_k \rVert_{p_k}^2}{b_{k-1}}, ω_{k-1}, 0 \Bigr)\]

and return the step size $s_k = \frac{1}{b_k}$.

Note that for $α=0$ this is the Riemannian variant of WNGRad.

Keyword arguments

  • adaptive=true: switch the gradient_reduction to $α$ (if true) or to 0 (if false).
  • alternate_bound = (bk, hat_c) -> min(gradient_bound == 0 ? 1.0 : gradient_bound, max(minimal_bound, bk / (3 * hat_c))): how to determine $\hat{b}_k$ as a function (bmin, bk, hat_c) -> hat_bk
  • count_threshold=4: an Integer for $\hat{c}$
  • gradient_reduction::R=adaptive ? 0.9 : 0.0: the gradient reduction factor threshold $α ∈ [0,1)$
  • gradient_bound=norm(M, p, X): the bound $b_k$.
  • minimal_bound=1e-4: the value $b_{\text{min}}$
  • p=rand(M): a point on the manifold $\mathcal M$, only used to define the gradient_bound
  • X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$, only used to define the gradient_bound
source
Manopt.ArmijoLinesearchFunction
ArmijoLinesearch(; kwargs...)
ArmijoLinesearch(M::AbstractManifold; kwargs...)

Specify a step size that performs an Armijo line search. Given a function $f:\mathcal M→ℝ$ and its Riemannian gradient $\operatorname{grad}f: \mathcal M→T\mathcal M$, the current point $p∈\mathcal M$ and a search direction $X∈T_{p}\mathcal M$.

Then the step size $s$ is found by reducing the initial step size $s$ until

\[f(\operatorname{retr}_p(sX)) ≤ f(p) + τs ⟨ X, \operatorname{grad}f(p) ⟩_p\]

is fulfilled for a sufficient decrease parameter $τ ∈ (0,1)$.

To be a bit more optimistic, if $s$ already fulfils this condition, an initial search is performed first, increasing the given $s$ until the condition fails for the first time.

Overall, we look for a step size that provides enough decrease; see [Bou23, p. 58] for more information.

Keyword arguments

  • additional_decrease_condition=(M, p) -> true: specify an additional criterion that has to be met to accept a step size in the decreasing loop
  • additional_increase_condition::IF=(M, p) -> true: specify an additional criterion that has to be met to accept a step size in the (initial) increase loop
  • candidate_point=allocate_result(M, rand): specify a point to be used as memory for the candidate points.
  • contraction_factor=0.95: how to update $s$ in the decrease step
  • initial_stepsize=1.0: specify an initial step size
  • initial_guess=armijo_initial_guess: compute the initial step size of a line search based on this function. The function required is (p,s,k,l) -> α and computes the initial step size $α$ based on an AbstractManoptProblem p, an AbstractManoptSolverState s, the current iteration number k and the last step size l.
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less=0.0: a safeguard, stop when the decreasing step is below this (nonnegative) bound.
  • stop_when_stepsize_exceeds=max_stepsize(M): a safeguard to not choose a too long step size when initially increasing
  • stop_increasing_at_step=100: stop the initial increasing loop after this amount of steps. Set to 0 to never increase in the beginning
  • stop_decreasing_at_step=1000: maximal number of Armijo decreases / tests to perform
  • sufficient_decrease=0.1: the sufficient decrease parameter $τ$

For the stopping safeguards you can pass :Messages to a debug= to see @info messages when these happen.

Info

This function generates a ManifoldDefaultsFactory for ArmijoLinesearchStepsize. For default values that depend on the manifold, this factory postpones the construction until the manifold from, for example, a corresponding AbstractManoptSolverState is available.

source
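
As a usage sketch, assuming a Rayleigh-quotient cost on the sphere (the matrix A, cost f, and gradient grad_f below are illustrative and not part of Manopt.jl):

using Manifolds, Manopt, LinearAlgebra

M = Sphere(2)
A = Diagonal([2.0, 1.0, 0.5])
f(M, p) = p' * A * p                               # Rayleigh quotient
grad_f(M, p) = 2 .* (A * p .- (p' * A * p) .* p)   # its Riemannian gradient on the sphere

q = gradient_descent(M, f, grad_f, rand(M);
    stepsize=ArmijoLinesearch(; contraction_factor=0.9, sufficient_decrease=1e-4),
)
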
Manopt.ConstantLengthFunction
ConstantLength(s; kwargs...)
ConstantLength(M::AbstractManifold, s; kwargs...)

Specify a Stepsize that is constant.

Input

  • M (optional)
  • s=min(injectivity_radius(M)/2, 1.0): the length to use.

Keyword argument

  • type::Symbol=:relative: specify the type of constant step size.
    • :relative – scale the gradient tangent vector $X$ to $s*X$
    • :absolute – scale the gradient to an absolute step length $s$, that is $\frac{s}{\lVert X \rVert}X$
Info

This function generates a ManifoldDefaultsFactory for ConstantStepsize. For default values that depend on the manifold, this factory postpones the construction until the manifold from, for example, a corresponding AbstractManoptSolverState is available.

source
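
To illustrate the two types, a small sketch:

using Manifolds, Manopt

M = Sphere(2)
s_rel = ConstantLength(M, 0.1)                  # step along 0.1*X
s_abs = ConstantLength(M, 0.1; type=:absolute)  # step of length 0.1, independent of the norm of X
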
Manopt.DecreasingLengthFunction
DecreasingLength(; kwargs...)
DecreasingLength(M::AbstractManifold; kwargs...)

Specify a Stepsize that is decreasing as $s_k = \frac{(l - ak)f^k}{(k+s)^e}$ with the following

Keyword arguments

  • exponent=1.0: the exponent $e$ in the denominator
  • factor=1.0: the factor $f$ in the numerator
  • length=min(injectivity_radius(M)/2, 1.0): the initial step size $l$.
  • subtrahend=0.0: a value $a$ that is subtracted every iteration
  • shift=0.0: shift the denominator iterator $k$ by $s$.
  • type::Symbol=:relative: specify the type of constant step size.
    • :relative – scale the gradient tangent vector $X$ to $s_k*X$
    • :absolute – scale the gradient to an absolute step length $s_k$, that is $\frac{s_k}{\lVert X \rVert}X$
Info

This function generates a ManifoldDefaultsFactory for DecreasingStepsize. For default values that depend on the manifold, this factory postpones the construction until the manifold from, for example, a corresponding AbstractManoptSolverState is available.

source
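
With the default factor, subtrahend, exponent, and shift, the rule reduces to $s_k = \frac{l}{k}$; a short sketch:

using Manifolds, Manopt

M = Sphere(2)
s = DecreasingLength(M; length=0.5)  # with the defaults this yields s_k = 0.5/k in iteration k
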
Manopt.NonmonotoneLinesearchFunction
NonmonotoneLinesearch(; kwargs...)
NonmonotoneLinesearch(M::AbstractManifold; kwargs...)

A functor representing a nonmonotone line search using the Barzilai-Borwein step size [IP17].

This method first computes

\[y_{k} = \operatorname{grad}f(p_{k}) - \mathcal T_{p_k←p_{k-1}}\operatorname{grad}f(p_{k-1})\]

and

\[s_{k} = - α_{k-1} ⋅ \mathcal T_{p_k←p_{k-1}}\operatorname{grad}f(p_{k-1}),\]

where $α_{k-1}$ is the step size computed in the last iteration and $\mathcal T_{⋅←⋅}$ is a vector transport. Then the Barzilai-Borwein step size is

\[α_k^{\text{BB}} = \begin{cases} \min(α_{\text{max}}, \max(α_{\text{min}}, τ_{k})), & \text{if} ⟨s_{k}, y_{k}⟩_{p_k} > 0,\\ α_{\text{max}}, & \text{else,} \end{cases}\]

where

\[τ_{k} = \frac{⟨s_{k}, s_{k}⟩_{p_k}}{⟨s_{k}, y_{k}⟩_{p_k}},\]

if the direct strategy is chosen, or

\[τ_{k} = \frac{⟨s_{k}, y_{k}⟩_{p_k}}{⟨y_{k}, y_{k}⟩_{p_k}},\]

in case of the inverse strategy, or an alternation between the two for the alternating strategy. Then find the smallest $h = 0, 1, 2, …$ such that

\[f(\operatorname{retr}_{p_k}(- σ^h α_k^{\text{BB}} \operatorname{grad}f(p_k))) ≤ \max_{1 ≤ j ≤ \min(k+1,m)} f(p_{k+1-j}) - γ σ^h α_k^{\text{BB}} ⟨\operatorname{grad}f(p_k), \operatorname{grad}f(p_k)⟩_{p_k},\]

where $σ ∈ (0,1)$ is a step length reduction factor, $m$ is the number of iterations after which the function value has to be lower than the current one, and $γ ∈ (0,1)$ is the sufficient decrease parameter. Finally the step size is computed as

\[α_k = σ^h α_k^{\text{BB}}.\]

Keyword arguments

  • p=allocate_result(M, rand): a point on the manifold $\mathcal M$ to store an interim result
  • initial_stepsize=1.0: the step size to start the search with
  • memory_size=10: number of iterations after which the cost value needs to be lower than the current one
  • bb_min_stepsize=1e-3: lower bound for the Barzilai-Borwein step size greater than zero
  • bb_max_stepsize=1e3: upper bound for the Barzilai-Borwein step size, greater than bb_min_stepsize
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • strategy=:direct: defines if the new step size is computed using the :direct, :indirect or :alternating strategy
  • storage=StoreStateAction(M; store_fields=[:Iterate, :Gradient]): increase efficiency by using a StoreStateAction for :Iterate and :Gradient.
  • stepsize_reduction=0.5: step size reduction factor contained in the interval $(0,1)$
  • sufficient_decrease=1e-4: sufficient decrease parameter contained in the interval $(0,1)$
  • stop_when_stepsize_less=0.0: smallest stepsize when to stop (the last one before is taken)
  • stop_when_stepsize_exceeds=max_stepsize(M, p): largest stepsize when to stop to avoid leaving the injectivity radius
  • stop_increasing_at_step=100: last step to increase the stepsize (phase 1),
  • stop_decreasing_at_step=1000: last step size to decrease the stepsize (phase 2),
source
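
A usage sketch, with the same illustrative Rayleigh-quotient setup as in the Armijo example above:

using Manifolds, Manopt, LinearAlgebra

M = Sphere(2)
A = Diagonal([2.0, 1.0, 0.5])
f(M, p) = p' * A * p
grad_f(M, p) = 2 .* (A * p .- (p' * A * p) .* p)

q = gradient_descent(M, f, grad_f, rand(M);
    stepsize=NonmonotoneLinesearch(; memory_size=5, strategy=:direct),
)
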
Manopt.PolyakFunction
Polyak(; kwargs...)
Polyak(M::AbstractManifold; kwargs...)

Compute a step size according to a method proposed by Polyak, cf. the dynamic step size discussed in Section 3.2 of [Ber15]. This has been generalised here both to the Riemannian case and to approximate the minimum cost value.

Let $f_{\text{best}}$ be the best cost value seen until now during some iterative optimisation algorithm and let $γ_k$ be a sequence of numbers that is square summable, but not summable.

Then the step size computed here reads

\[s_k = \frac{f(p^{(k)}) - f_{\text{best}} + γ_k}{\lVert ∂f(p^{(k)}) \rVert},\]

where $∂f$ denotes a nonzero subgradient of $f$ at the current iterate $p^{(k)}$.

Constructor

Polyak(; γ = k -> 1/k, initial_cost_estimate=0.0)

initialize the Polyak stepsize to a certain sequence and an initial estimate of $f_{\text{best}}$.

Info

This function generates a ManifoldDefaultsFactory for PolyakStepsize. For default values that depend on the manifold, this factory postpones the construction until the manifold from, for example, a corresponding AbstractManoptSolverState is available.

source
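
Since this step size expects a subgradient, a usage sketch with the subgradient_method on the sphere; the cost and its subgradient below are illustrative:

using Manifolds, Manopt

M = Sphere(2)
p_star = [1.0, 0.0, 0.0]
f(M, p) = distance(M, p, p_star)
# where f is differentiable, its (sub)gradient is -log_p(p_star)/distance(p, p_star)
∂f(M, p) = -log(M, p, p_star) / max(distance(M, p, p_star), eps())

q = subgradient_method(M, f, ∂f, rand(M); stepsize=Polyak())
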
Manopt.WolfePowellLinesearchFunction
WolfePowellLinesearch(; kwargs...)
WolfePowellLinesearch(M::AbstractManifold; kwargs...)

Perform a line search to fulfill both the Armijo-Goldstein conditions

\[f\bigl( \operatorname{retr}_{p}(αX) \bigr) ≤ f(p) + c_1 α ⟨\operatorname{grad} f(p), X⟩_{p}\]

as well as the Wolfe conditions

\[\frac{\mathrm{d}}{\mathrm{d}t} f\bigl(\operatorname{retr}_{p}(tX)\bigr) \Big\vert_{t=α} ≥ c_2 \frac{\mathrm{d}}{\mathrm{d}t} f\bigl(\operatorname{retr}_{p}(tX)\bigr)\Big\vert_{t=0}.\]

for some given sufficient decrease coefficient $c_1$ and some sufficient curvature condition coefficient $c_2$.

This is adapted from [NW06, Section 3.1].

Keyword arguments

  • sufficient_decrease=10^(-4)
  • sufficient_curvature=0.999
  • p::P: a point on the manifold $\mathcal M$, as temporary storage for candidates
  • X::T: a tangent vector at the point $p$ on the manifold $\mathcal M$, as the type of memory allocated for the candidate direction and tangent vector
  • max_stepsize=max_stepsize(M, p): largest stepsize allowed here.
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less=0.0: smallest stepsize when to stop (the last one before is taken)
  • vector_transport_method=default_vector_transport_method(M, typeof(p)): a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports
source
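
This line search is a common choice within quasi-Newton methods; a usage sketch with the same illustrative Rayleigh-quotient setup as above:

using Manifolds, Manopt, LinearAlgebra

M = Sphere(2)
A = Diagonal([2.0, 1.0, 0.5])
f(M, p) = p' * A * p
grad_f(M, p) = 2 .* (A * p .- (p' * A * p) .* p)

q = quasi_Newton(M, f, grad_f, rand(M);
    stepsize=WolfePowellLinesearch(; sufficient_decrease=1e-4, sufficient_curvature=0.999),
)
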
Manopt.WolfePowellBinaryLinesearchFunction
WolfePowellBinaryLinesearch(; kwargs...)
WolfePowellBinaryLinesearch(M::AbstractManifold; kwargs...)

Perform a line search to fulfill both the Armijo-Goldstein conditions for some given sufficient decrease coefficient $c_1$ and some sufficient curvature condition coefficient $c_2$. Compared to WolfePowellLinesearch, which tries a simpler method, this line search performs the following algorithm.

With

\[A(t) = f(p_+) ≤ f(p) + c_1 t ⟨\operatorname{grad}f(p), X⟩_{p} \quad\text{ and }\quad W(t) = ⟨\operatorname{grad}f(p_+), \mathcal T_{p_+←p}X⟩_{p_+} ≥ c_2 ⟨X, \operatorname{grad}f(p)⟩_p,\]

where $p_+ = \operatorname{retr}_p(tX)$ is the current trial point and $\mathcal T_{⋅←⋅}$ denotes a vector transport. Then the following algorithm is performed, similar to Algorithm 7 from [Hua14]:

  1. set $α=0$, $β=∞$ and $t=1$.
  2. While either $A(t)$ does not hold or $W(t)$ does not hold do steps 3-5.
  3. If $A(t)$ fails, set $β=t$.
  4. If $A(t)$ holds but $W(t)$ fails, set $α=t$.
  5. If $β<∞$ set $t=\frac{α+β}{2}$, otherwise set $t=2α$.

Keyword arguments

source

Some step sizes use the max_stepsize function as a rough upper estimate for the trust region size. By default it equals the injectivity radius of the exponential map, but in some cases a different value is used. For the FixedRankMatrices manifold an estimate from Manopt is used. The tangent bundle with the Sasaki metric has injectivity radius 0, so the maximum step size of the underlying manifold is used instead. Hyperrectangle also has injectivity radius 0, and an estimate based on the maximum of the dimensions along each index is used instead. For manifolds with corners, however, a line search capable of handling break points along the projected search direction should be used; such algorithms do not call max_stepsize.

Internally, these step size functions create a ManifoldDefaultsFactory. They rely on the following helpers:

Manopt.armijo_initial_guessMethod
armijo_initial_guess(mp::AbstractManoptProblem, s::AbstractManoptSolverState, k, l)

Return an initial guess for the ArmijoLinesearchStepsize.

The default provided is based on max_stepsize(M), which we denote by $m$. Let further $X$ be the current descent direction and $n=\lVert X \rVert_{p}$ its norm. Then this (default) initial guess returns

  • $l$ if $m$ is not finite
  • $\min(l, \frac{m}{n})$ otherwise

This ensures that the initial guess does not yield too large (initial) steps.

source
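
The described rule is essentially the following small function; capped_guess is not a Manopt.jl function, just a sketch of the logic:

# l: last step size, m: max_stepsize(M), n: norm of the descent direction
capped_guess(l, m, n) = isfinite(m) ? min(l, m / n) : l
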
Manopt.get_last_stepsizeMethod
get_last_stepsize(amp::AbstractManoptProblem, ams::AbstractManoptSolverState, vars...)

return the last computed stepsize stored within AbstractManoptSolverState ams when solving the AbstractManoptProblem amp.

This method takes into account that ams might be decorated. In case this returns NaN, a concrete call to the stored stepsize is performed. For this, usually, the first of the vars... should be the current iterate.

source
Manopt.get_last_stepsizeMethod
get_last_stepsize(::Stepsize, vars...)

return the last computed stepsize from within the stepsize. If no last step size is stored, this returns NaN.

source
Manopt.linesearch_backtrackMethod
(s, msg) = linesearch_backtrack(M, f, p, X, s, decrease, contract, η = -X, f0 = f(p); kwargs...)
(s, msg) = linesearch_backtrack!(M, q, f, p, X, s, decrease, contract, η = -X, f0 = f(p); kwargs...)

perform a line search

  • on manifold M
  • for the cost function f,
  • at the current point p
  • with current gradient provided in X
  • an initial stepsize s
  • a sufficient decrease
  • a contraction factor $σ$
  • a search direction $η = -X$
  • an offset, $f_0 = f(p)$

Keyword arguments

  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less=0.0: to avoid numerical underflow
  • stop_when_stepsize_exceeds=max_stepsize(M, p) / norm(M, p, η): to avoid leaving the injectivity radius on a manifold
  • stop_increasing_at_step=100: stop the initial increase of step size after these many steps
  • stop_decreasing_at_step=1000: stop the decreasing search after these many steps
  • additional_increase_condition=(M,p) -> true: impose an additional condition for an increased step size to be accepted
  • additional_decrease_condition=(M,p) -> true: impose an additional condition for a decreased step size to be accepted

These keywords act as safeguards; only the maximal step size is a genuinely manifold-specific one.

Return value

A step size s and a message msg (in case any of the four stopping criteria triggered).

source
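
A direct call could look like the following sketch; as usual in Manopt.jl the cost takes (M, p), and the decrease and contraction values are illustrative:

using Manifolds, Manopt, LinearAlgebra

M = Sphere(2)
A = Diagonal([2.0, 1.0, 0.5])
f(M, p) = p' * A * p
p = rand(M)
X = 2 .* (A * p .- (p' * A * p) .* p)  # gradient at p; the search direction defaults to η = -X

s, msg = Manopt.linesearch_backtrack(M, f, p, X, 1.0, 1e-4, 0.95)
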
Manopt.max_stepsizeMethod
max_stepsize(M::AbstractManifold, p)
max_stepsize(M::AbstractManifold)

Get the maximum stepsize (at point p) on manifold M. It should be used to limit the distance an algorithm is trying to move in a single step.

By default, this returns injectivity_radius(M), if it is available. Otherwise the method returns Inf.

source
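
For example, on a sphere this is the injectivity radius:

using Manifolds, Manopt

Manopt.max_stepsize(Sphere(2))  # π, the injectivity radius of the sphere
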
Manopt.AdaptiveWNGradientStepsizeType
AdaptiveWNGradientStepsize{I<:Integer,R<:Real,F<:Function} <: Stepsize

A functor (problem, state, k, X) -> s for the adaptive gradient method introduced by [GS23]. See AdaptiveWNGradient for the mathematical details.

Fields

  • count_threshold::I: an Integer for $\hat{c}$
  • minimal_bound::R: the value for $b_{\text{min}}$
  • alternate_bound::F: how to determine $\hat{b}_k$ as a function of (bmin, bk, hat_c) -> hat_bk
  • gradient_reduction::R: the gradient reduction factor threshold $α ∈ [0,1)$
  • gradient_bound::R: the bound $b_k$.
  • weight::R: $ω_k$, initialised to $ω_0 =$ norm(M, p, X) if this is not zero, and to 1.0 otherwise.
  • count::I: $c_k$, initialised to $c_0 = 0$.

Constructor

AdaptiveWNGradientStepsize(M::AbstractManifold; kwargs...)

Keyword arguments

  • adaptive=true: switch the gradient_reduction to $α$ (if true) or to 0 (if false).
  • alternate_bound = (bk, hat_c) -> min(gradient_bound == 0 ? 1.0 : gradient_bound, max(minimal_bound, bk / (3 * hat_c)))
  • count_threshold=4
  • gradient_reduction::R=adaptive ? 0.9 : 0.0
  • gradient_bound=norm(M, p, X)
  • minimal_bound=1e-4
  • p=rand(M): a point on the manifold $\mathcal M$, only used to define the gradient_bound
  • X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$, only used to define the gradient_bound
source
Manopt.ArmijoLinesearchStepsizeType
ArmijoLinesearchStepsize <: Linesearch

A functor (problem, state, k, X) -> s to provide an Armijo line search to compute a step size, based on the search direction X.

Fields

  • candidate_point: to store an interim result
  • initial_stepsize: an initial step size
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • contraction_factor: exponent for line search reduction
  • sufficient_decrease: gain within Armijo's rule
  • last_stepsize: the last step size to start the search with
  • initial_guess: a function to provide an initial guess for the step size. It maps (p,s,k,l) -> α based on an AbstractManoptProblem p, an AbstractManoptSolverState s, the current iteration number k and the last step size l, and returns the initial guess α.
  • additional_decrease_condition: specify a condition a new point has to additionally fulfill. The default accepts all points.
  • additional_increase_condition: specify a condition that has to be fulfilled, additionally to checking for a valid increase. The default accepts all points.
  • stop_when_stepsize_less: smallest stepsize when to stop (the last one before is taken)
  • stop_when_stepsize_exceeds: largest stepsize when to stop.
  • stop_increasing_at_step: last step to increase the stepsize (phase 1),
  • stop_decreasing_at_step: last step size to decrease the stepsize (phase 2),

Pass :Messages to a debug= to see @info messages when these happen.

Constructor

ArmijoLinesearchStepsize(M::AbstractManifold; kwarg...)

where the fields are set via the keyword arguments and the retraction is set to the default retraction on M.

Keyword arguments

  • candidate_point=(allocate_result(M, rand))
  • initial_stepsize=1.0
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • contraction_factor=0.95
  • sufficient_decrease=0.1
  • last_stepsize=initial_stepsize
  • initial_guess=armijo_initial_guess: a function (p,s,k,l) -> α to compute the initial step size
  • stop_when_stepsize_less=0.0: stop when the step size decreases below this value.
  • stop_when_stepsize_exceeds=max_stepsize(M): provide an absolute maximal step size.
  • stop_increasing_at_step=100: for the initial increase test, stop after these many steps
  • stop_decreasing_at_step=1000: in the backtrack, stop after these many steps
source
Manopt.ConstantStepsizeType
ConstantStepsize <: Stepsize

A functor (problem, state, ...) -> s to provide a constant step size s.

Fields

  • length: constant value for the step size
  • type: a symbol that indicates whether the stepsize is relatively (:relative), with respect to the gradient norm, or absolutely (:absolute) constant.

Constructors

ConstantStepsize(s::Real, t::Symbol=:relative)

initialize the stepsize to a constant s of type t.

ConstantStepsize(
    M::AbstractManifold=DefaultManifold(),
    s=min(1.0, injectivity_radius(M)/2);
    type::Symbol=:relative
)
source
Manopt.DecreasingStepsizeType
DecreasingStepsize <: Stepsize

A functor (problem, state, ...) -> s to provide a decreasing step size s.

Fields

  • exponent: a value $e$, the exponent applied to the (shifted) iteration number in the denominator
  • factor: a value $f$ to multiply the initial step size with every iteration
  • length: the initial step size $l$.
  • subtrahend: a value $a$ that is subtracted every iteration
  • shift: shift the denominator iterator $i$ by $s$.
  • type: a symbol that indicates whether the stepsize is relatively (:relative), with respect to the gradient norm, or absolutely (:absolute) constant.

In total, the complete formula for the $i$th iterate reads

\[s_i = \frac{(l - i a)f^i}{(i+s)^e}\]

and hence the default simplifies to just $s_i = \frac{l}{i}$.

Constructor

DecreasingStepsize(M::AbstractManifold;
    length=min(injectivity_radius(M)/2, 1.0),
    factor=1.0,
    subtrahend=0.0,
    exponent=1.0,
    shift=0.0,
    type=:relative,
)

initializes all fields; none of them is mandatory, and the length is set to half the injectivity radius, or to $1.0$ if the injectivity radius is infinite.

source
Manopt.LinesearchType
Linesearch <: Stepsize

An abstract functor to represent line search type step size determinations, see Stepsize for details. One example is the ArmijoLinesearchStepsize functor.

Compared to simple step sizes, the line search functors provide an interface of the form (p,o,i,X) -> s with an additional (but optional) fourth parameter to provide a search direction; this should default to something reasonable, most prominently the negative gradient.

source
Manopt.NonmonotoneLinesearchStepsizeType
NonmonotoneLinesearchStepsize{P,T,R<:Real} <: Linesearch

A functor representing a nonmonotone line search using the Barzilai-Borwein step size [IP17].

Fields

  • initial_stepsize=1.0: the step size to start the search with
  • memory_size=10: number of iterations after which the cost value needs to be lower than the current one
  • bb_min_stepsize=1e-3: lower bound for the Barzilai-Borwein step size greater than zero
  • bb_max_stepsize=1e3: upper bound for the Barzilai-Borwein step size, greater than bb_min_stepsize
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • strategy=:direct: defines if the new step size is computed using the :direct, :indirect or :alternating strategy
  • storage: (for :Iterate and :Gradient) a StoreStateAction
  • stepsize_reduction: step size reduction factor contained in the interval (0,1)
  • sufficient_decrease: sufficient decrease parameter contained in the interval (0,1)
  • vector_transport_method=default_vector_transport_method(M, typeof(p)): a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports
  • candidate_point: to store an interim result
  • stop_when_stepsize_less: smallest stepsize when to stop (the last one before is taken)
  • stop_when_stepsize_exceeds: largest stepsize when to stop.
  • stop_increasing_at_step: last step to increase the stepsize (phase 1),
  • stop_decreasing_at_step: last step size to decrease the stepsize (phase 2),

Constructor

NonmonotoneLinesearchStepsize(M::AbstractManifold; kwargs...)

Keyword arguments

  • p=allocate_result(M, rand): to store an interim result
  • initial_stepsize=1.0
  • memory_size=10
  • bb_min_stepsize=1e-3
  • bb_max_stepsize=1e3
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • strategy=:direct
  • storage=StoreStateAction(M; store_fields=[:Iterate, :Gradient])
  • stepsize_reduction=0.5
  • sufficient_decrease=1e-4
  • stop_when_stepsize_less=0.0
  • stop_when_stepsize_exceeds=max_stepsize(M, p)
  • stop_increasing_at_step=100
  • stop_decreasing_at_step=1000
  • vector_transport_method=default_vector_transport_method(M, typeof(p)): a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports
source
Manopt.PolyakStepsizeType
PolyakStepsize <: Stepsize

A functor (problem, state, ...) -> s to provide a step size due to Polyak, cf. Section 3.2 of [Ber15].

Fields

  • γ: a function k -> ... representing a sequence.
  • best_cost_value : storing the best cost value

Constructor

PolyakStepsize(;
    γ = i -> 1/i,
    initial_cost_estimate=0.0
)

Construct a stepsize of Polyak type.

See also

Polyak

source
Manopt.WolfePowellBinaryLinesearchStepsizeType
WolfePowellBinaryLinesearchStepsize{R} <: Linesearch

Do a backtracking line search to find a step size $α$ that fulfils the Wolfe conditions along a search direction $X$ starting from $p$. See WolfePowellBinaryLinesearch for the math details.

Fields

  • sufficient_decrease::R, sufficient_curvature::R: two constants in the line search
  • last_stepsize::R
  • max_stepsize::R
  • retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less::R: a safeguard to stop when the stepsize gets too small
  • vector_transport_method::AbstractVectorTransportMethod: a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports

Keyword arguments

source
Manopt.WolfePowellLinesearchStepsizeType
WolfePowellLinesearchStepsize{R<:Real} <: Linesearch

Do a backtracking line search to find a step size $α$ that fulfils the Wolfe conditions along a search direction $X$ starting from $p$. See WolfePowellLinesearch for the math details.

Fields

  • sufficient_decrease::R, sufficient_curvature::R: two constants in the line search
  • candidate_direction::T: a tangent vector at the point $p$ on the manifold $\mathcal M$
  • candidate_point::P: a point on the manifold $\mathcal M$, as temporary storage for candidates
  • candidate_tangent::T: a tangent vector at the point $p$ on the manifold $\mathcal M$
  • last_stepsize::R
  • max_stepsize::R
  • retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less::R: a safeguard to stop when the stepsize gets too small
  • vector_transport_method::AbstractVectorTransportMethod: a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports

Keyword arguments

  • sufficient_decrease=10^(-4)
  • sufficient_curvature=0.999
  • p::P: a point on the manifold $\mathcal M$, as temporary storage for candidates
  • X::T: a tangent vector at the point $p$ on the manifold $\mathcal M$, as the type of memory allocated for the candidate direction and tangent vector
  • max_stepsize=max_stepsize(M, p): largest stepsize allowed here.
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stop_when_stepsize_less=0.0: smallest stepsize when to stop (the last one before is taken)
  • vector_transport_method=default_vector_transport_method(M, typeof(p)): a vector transport $\mathcal T_{⋅←⋅}$ to use, see the section on vector transports
source

Some solvers have a different iterate from the one used for the line search. Then the following state can be used to wrap these locally

Manopt.StepsizeStateType
StepsizeState{P,T} <: AbstractManoptSolverState

A state to store a point and a descent direction used within a linesearch, if these are different from the iterate and search direction of the main solver.

Fields

  • p::P: a point on a manifold
  • X::T: a tangent vector at p.

Constructor

StepsizeState(p,X)
StepsizeState(M::AbstractManifold; p=rand(M), X=zero_vector(M,p))

See also

interior_point_Newton

source

Literature

[Ber15]
D. P. Bertsekas. Convex Optimization Algorithms (Athena Scientific, 2015); p. 576.
[Bou23]
N. Boumal. An Introduction to Optimization on Smooth Manifolds (Cambridge University Press, 2023).
[GS23]
G. N. Grapiglia and G. F. D. Stella. An adaptive Riemannian gradient method without function evaluations. Journal of Optimization Theory and Applications (2023).
[Hua14]
W. Huang. Optimization algorithms on Riemannian manifolds with applications. Ph.D. Thesis, Florida State University (2014).
[IP17]
B. Iannazzo and M. Porcelli. The Riemannian Barzilai–Borwein method with nonmonotone line search and the matrix geometric mean computation. IMA Journal of Numerical Analysis 38, 495–517 (2017).
[NW06]
J. Nocedal and S. J. Wright. Numerical Optimization. 2 Edition (Springer, New York, 2006).