Stepsize and Linesearch

Most iterative algorithms determine a search direction along which to proceed and then a step size to obtain the next iterate. How sophisticated the step size computation can be depends, among other things, on the properties of the problem at hand.

Within Manopt.jl, the step size determination is implemented as a functor based on

Manopt.Stepsize – Type
Stepsize

An abstract type for the functors representing step sizes, i.e. they are callable structures. The naming scheme is TypeOfStepSize, e.g. ConstantStepsize.

Every Stepsize has to provide a constructor, and its functor has to implement the interface (p,o,i), where a Problem p, Options o, and the current iteration number i are the arguments; it returns a number, namely the step size to use.

See also

Linesearch

source

For consistency, a constructor should usually take the manifold M as its first argument, so that general step size functors can be set up with default values that depend on the manifold currently under consideration.
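As a minimal sketch of this interface, the following hypothetical functor (not part of Manopt.jl) halves an initial step size every period iterations; note the constructor taking the manifold M first, as recommended above.

using Manopt, Manifolds

# A hypothetical step size: halve an initial value every `period` iterations.
struct PeriodicHalvingStepsize <: Manopt.Stepsize
    initial::Float64
    period::Int
end
# convention: the manifold M is the first constructor argument, so defaults
# (here the initial step size) can depend on it
function PeriodicHalvingStepsize(M::AbstractManifold; period=10)
    return PeriodicHalvingStepsize(injectivity_radius(M) / 2, period)
end
# the (p, o, i) functor interface: Problem, Options, current iteration number
function (s::PeriodicHalvingStepsize)(p::Manopt.Problem, o::Manopt.Options, i)
    return s.initial * 0.5^(i ÷ s.period)
end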

Currently, the following step sizes are available:

Manopt.ArmijoLinesearch – Type
ArmijoLinesearch <: Linesearch

A functor representing Armijo line search including the last run's state, i.e. the last step size.

Fields

  • initial_stepsize – (1.0) an initial step size
  • retraction_method – (default_retraction_method(M)) the retraction to use
  • contraction_factor – (0.95) the factor by which the step size is shortened in each backtracking step
  • sufficient_decrease – (0.1) the gain within Armijo's rule
  • last_stepsize – (initial_stepsize) the last step size, which the next search starts with
  • linesearch_stopsize – (0.0) a safeguard to stop the line search before the step is numerically zero; this should be combined with StopWhenStepsizeLess
  • initial_guess – ((p,o,i,l) -> l) a function that, based on a GradientProblem p, Options o, the current iteration number i, and the last step size l, returns an initial guess; the default uses the last obtained step size

Constructor

ArmijoLinesearch(M)

with the Fields above as keyword arguments and the retraction set to the default retraction on M.

The constructors return the functor to perform Armijo line search, where two interfaces are available:

  • based on a tuple (p,o,i) of a GradientProblem p, Options o, and the current iteration number i,
  • or as (M, x, F, gradFx[, η=-gradFx]) -> s, where a manifold M, a current point x, a function F that maps from the manifold to the real numbers, its gradient (a tangent vector) gradFx$=\operatorname{grad}F(x)$ at x, and an optional search direction tangent vector η=-gradFx are the arguments.
source
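For illustration, a small sketch plugging this functor into a gradient descent run on the sphere; the one-argument cost and gradient signatures assumed here follow the interface described above and may need adapting to your version.

using Manopt, Manifolds

# Sketch: minimize half the squared distance to q on the 2-sphere,
# with Armijo backtracking as the step size rule.
M = Sphere(2)
q = [0.0, 0.0, 1.0]
F(x) = distance(M, x, q)^2 / 2
gradF(x) = -log(M, x, q)    # Riemannian gradient of F
x0 = [1.0, 0.0, 0.0]
x_opt = gradient_descent(M, F, gradF, x0; stepsize=ArmijoLinesearch(M))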
Manopt.ConstantStepsize – Type
ConstantStepsize <: Stepsize

A functor that always returns a fixed step size.

Fields

  • length – constant value for the step size.

Constructors

ConstantStepsize(s)

initialize the stepsize to a constant s (deprecated)

ConstantStepsize(M::AbstractManifold=DefaultManifold(2); stepsize=injectivity_radius(M)/2)

initialize the step size to a constant, which by default is half the injectivity radius of M; if that radius is infinite, the default step size is 1.

source
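For example, on the 2-sphere the injectivity radius is π, so the default yields a constant step size of π/2 (a small sketch):

using Manopt, Manifolds

M = Sphere(2)
s1 = ConstantStepsize(M)                 # s1.length == π/2
s2 = ConstantStepsize(M; stepsize=0.1)   # an explicitly chosen constant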
Manopt.DecreasingStepsize – Type
DecreasingStepsize()

A functor that represents a schedule of decreasing step sizes.

Fields

  • length – (1) the initial step size $l$.
  • factor – (1) a value $f$ the step size is multiplied by in every iteration
  • subtrahend – (0) a value $a$ that is subtracted in every iteration
  • exponent – (1) a value $e$ to which the (shifted) iteration number in the denominator is raised
  • shift – (0) a value $s$ the denominator iterator $i$ is shifted by

In total, the formula for the $i$th iterate reads

\[s_i = \frac{(l - i a)f^i}{(i+s)^e}\]

and hence the default simplifies to just $s_i = \frac{l}{i}$

Constructor

DecreasingStepsize(l=1,f=1,a=0,e=1,s=0)

Alternatively one can also use the following keyword-based constructor:

DecreasingStepsize(
    M::AbstractManifold=DefaultManifold(3);
    length=injectivity_radius(M)/2, multiplier=1.0, subtrahend=0.0, exponent=1.0, shift=0)

initializes all fields above, where none of them is mandatory; the length is set to half the injectivity radius of M, and to $1$ if the injectivity radius is infinite.

source
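As a quick check of the formula (plain arithmetic, not Manopt API), the default parameters reproduce the classical $\frac{l}{i}$ schedule:

# evaluate s_i = (l - i*a) * f^i / (i + s)^e for the default parameters
l, f, a, e, s = 1.0, 1.0, 0.0, 1.0, 0
schedule(i) = (l - i * a) * f^i / (i + s)^e
[schedule(i) for i in 1:4]   # [1.0, 0.5, 0.3333…, 0.25]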
Manopt.Linesearch – Type
Linesearch <: Stepsize

An abstract functor to represent line search type step size determinations, see Stepsize for details. One example is the ArmijoLinesearch functor.

Compared to simple step sizes, the linesearch functors provide an interface of the form (p,o,i,η) -> s with an additional (but optional) fourth parameter to provide a search direction; this should default to something reasonable, e.g. the negative gradient.

source
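A minimal sketch of this four-argument interface (a hypothetical functor, assuming a gradient-based Problem p with manifold p.M and Options o storing the iterate in o.x):

using Manopt, Manifolds, LinearAlgebra

# A hypothetical linesearch-type functor: scale the step by the inverse
# length of the search direction η, defaulting to the negative gradient.
struct NormalizedStepsize <: Manopt.Linesearch end
function (ls::NormalizedStepsize)(
    p::Manopt.Problem, o::Manopt.Options, i, η = -get_gradient(p, o.x)
)
    return 1.0 / max(norm(p.M, o.x, η), eps())
end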
Manopt.NonmonotoneLinesearch – Type
NonmonotoneLinesearch <: Linesearch

A functor representing a nonmonotone line search using the Barzilai-Borwein step size [Iannazzo2018]. Together with a gradient descent algorithm, this line search represents the Riemannian Barzilai-Borwein with nonmonotone line search (RBBNMLS) algorithm. The order of the algorithm steps is shifted compared to the paper by Iannazzo and Porcelli, so that in each iteration we first find

\[y_{k} = \operatorname{grad}F(x_{k}) - \operatorname{T}_{x_{k-1} → x_k}(\operatorname{grad}F(x_{k-1}))\]

and

\[s_{k} = - α_{k-1} * \operatorname{T}_{x_{k-1} → x_k}(\operatorname{grad}F(x_{k-1})),\]

where $α_{k-1}$ is the step size computed in the last iteration and $\operatorname{T}$ is a vector transport. We then find the Barzilai–Borwein step size

\[α_k^{\text{BB}} = \begin{cases} \min(α_{\text{max}}, \max(α_{\text{min}}, τ_{k})), & \text{if } ⟨s_{k}, y_{k}⟩_{x_k} > 0,\\ α_{\text{max}}, & \text{else,} \end{cases}\]

where

\[τ_{k} = \frac{⟨s_{k}, s_{k}⟩_{x_k}}{⟨s_{k}, y_{k}⟩_{x_k}},\]

if the direct strategy is chosen,

\[τ_{k} = \frac{⟨s_{k}, y_{k}⟩_{x_k}}{⟨y_{k}, y_{k}⟩_{x_k}},\]

in case of the inverse strategy and an alternation between the two in case of the alternating strategy. Then we find the smallest $h = 0, 1, 2, …$ such that

\[F(\operatorname{retr}_{x_k}(- σ^h α_k^{\text{BB}} \operatorname{grad}F(x_k))) \leq \max_{1 ≤ j ≤ \min(k+1,m)} F(x_{k+1-j}) - γ σ^h α_k^{\text{BB}} ⟨\operatorname{grad}F(x_k), \operatorname{grad}F(x_k)⟩_{x_k},\]

where $σ$ is a step length reduction factor $∈ (0,1)$, $m$ is the number of iterations after which the function value has to be lower than the current one and $γ$ is the sufficient decrease parameter $∈(0,1)$. We can then find the new stepsize by

\[α_k = σ^h α_k^{\text{BB}}.\]

Fields

  • initial_stepsize – (1.0) the step size we start the search with
  • linesearch_stopsize – (0.0) a safeguard to stop the line search before the step is numerically zero; this should be combined with StopWhenStepsizeLess
  • memory_size – (10) number of iterations after which the cost value needs to be lower than the current one
  • min_stepsize – (1e-3) lower bound for the Barzilai-Borwein step size; has to be greater than zero
  • max_stepsize – (1e3) upper bound for the Barzilai-Borwein step size; has to be greater than min_stepsize
  • retraction_method – (ExponentialRetraction()) the retraction to use
  • strategy – (direct) defines if the new step size is computed using the direct, inverse or alternating strategy
  • storage – (x, gradient) a StoreOptionsAction to store old_x and old_gradient, the x-value and corresponding gradient of the previous iteration
  • stepsize_reduction – (0.5) step size reduction factor contained in the interval (0,1)
  • sufficient_decrease – (1e-4) sufficient decrease parameter contained in the interval (0,1)
  • vector_transport_method – (ParallelTransport()) the vector transport method to use

Constructor

NonmonotoneLinesearch()

with the Fields above in their order as optional arguments (deprecated).

NonmonotoneLinesearch(M)

with the Fields above in their order as keyword arguments, where the retraction and vector transport are set to the default ones on M, respectively.

The constructors return the functor to perform nonmonotone line search.

source
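For instance (a sketch; the strategy is assumed to be passed as a Symbol):

using Manopt, Manifolds

M = Sphere(2)
# shorter memory and the alternating strategy, other fields at their defaults
ls = NonmonotoneLinesearch(M; memory_size=5, strategy=:alternating)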
Manopt.WolfePowellBinaryLinesearch – Type
WolfePowellBinaryLinesearch <: Linesearch

A Linesearch method that determines a step size t fulfilling the Wolfe conditions based on a binary chop. Let $η$ be a search direction and $c_1, c_2 > 0$ two constants. Then with

\[A(t) = f(x_+) ≤ f(x) + c_1 t ⟨\operatorname{grad}f(x), η⟩_{x} \quad\text{and}\quad W(t) = ⟨\operatorname{grad}f(x_+), \text{V}_{x_+\gets x}η⟩_{x_+} ≥ c_2 ⟨η, \operatorname{grad}f(x)⟩_x,\]

where $x_+ = \operatorname{retr}_x(tη)$ is the current trial point and $\text{V}$ is a vector transport, we perform the following algorithm, similar to Algorithm 7 in [Huang2014]:

  1. set $α=0$, $β=∞$ and $t=1$.
  2. While either $A(t)$ does not hold or $W(t)$ does not hold do steps 3-5.
  3. If $A(t)$ fails, set $β=t$.
  4. If $A(t)$ holds but $W(t)$ fails, set $α=t$.
  5. If $β<∞$ set $t=\frac{α+β}{2}$, otherwise set $t=2α$.
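To clarify the loop structure, here is a plain sketch of steps 1-5, with the two conditions passed in as predicates A and W (a hypothetical helper, not the Manopt implementation):

# binary chop for a step size t satisfying both predicates A(t) and W(t)
function wolfe_binary_chop(A, W; maxiter=100)
    α, β, t = 0.0, Inf, 1.0                # step 1
    for _ in 1:maxiter                     # step 2
        (A(t) && W(t)) && return t         # both conditions hold
        if !A(t)
            β = t                          # step 3: decrease condition fails
        else
            α = t                          # step 4: curvature condition fails
        end
        t = isinf(β) ? 2α : (α + β) / 2    # step 5
    end
    return t
end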

Constructors

There exist two constructors. When providing the manifold M as the first (optional) parameter, its default retraction and vector transport are used; in this case the retraction and the vector transport are also available as keyword arguments for ease of use. The other constructor is kept for backward compatibility. Note that linesearch_stopsize, which stops the search for too small step sizes, is only available in the new signature including M; for the first one it is set to the old default of 1e-9.

WolfePowellBinaryLinesearch(
    retr::AbstractRetractionMethod=ExponentialRetraction(),
    vtr::AbstractVectorTransportMethod=ParallelTransport(),
    c1::Float64=10^(-4),
    c2::Float64=0.999
)

WolfePowellBinaryLinesearch(
    M,
    c1::Float64=10^(-4),
    c2::Float64=0.999;
    retraction_method = default_retraction_method(M),
    vector_transport_method = default_vector_transport(M),
    linesearch_stopsize = 0.0
)
source
Manopt.WolfePowellLinesearch – Type
WolfePowellLinesearch <: Linesearch

Do a backtracking linesearch to find a step size $α$ that fulfils the Wolfe conditions along a search direction $η$ starting from $x$, i.e.

\[f\bigl( \operatorname{retr}_x(αη) \bigr) ≤ f(x) + c_1 α ⟨\operatorname{grad}f(x), η⟩_x \quad\text{and}\quad \frac{\mathrm{d}}{\mathrm{d}t} f\bigl(\operatorname{retr}_x(tη)\bigr) \Big\vert_{t=α} ≥ c_2 \frac{\mathrm{d}}{\mathrm{d}t} f\bigl(\operatorname{retr}_x(tη)\bigr)\Big\vert_{t=0}.\]

Constructors

There exist two constructors. When providing the manifold M as the first (optional) parameter, its default retraction and vector transport are used; in this case the retraction and the vector transport are also available as keyword arguments for ease of use. The other constructor is kept for backward compatibility. Note that linesearch_stopsize, which stops the search for too small step sizes, is only available in the new signature including M; for the old (deprecated) signature it is set to the old hard-coded default of 1e-12.

WolfePowellLinesearch(
    retr::AbstractRetractionMethod=ExponentialRetraction(),
    vtr::AbstractVectorTransportMethod=ParallelTransport(),
    c1::Float64=10^(-4),
    c2::Float64=0.999
)

WolfePowellLinesearch(
    M,
    c1::Float64=10^(-4),
    c2::Float64=0.999;
    retraction_method = default_retraction_method(M),
    vector_transport_method = default_vector_transport(M),
    linesearch_stopsize = 0.0
)
source
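For example, using the keyword-based constructor (a sketch following the signature above):

using Manopt, Manifolds

M = Sphere(2)
ls = WolfePowellLinesearch(M, 1e-4, 0.99; linesearch_stopsize=1e-12)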
Manopt.get_stepsize – Method
get_stepsize(p::Problem, o::Options, vars...)

return the step size stored within the Options o when solving the Problem p. This method also works for decorated options; the Stepsize functor within the options is by default stored in o.stepsize.

source
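A small sketch of reading the step size from a run's objects; the GradientProblem and GradientDescentOptions constructors are assumed here and may differ between versions:

using Manopt, Manifolds

M = Euclidean(2)
F(x) = sum(x .^ 2) / 2
gradF(x) = x
p = GradientProblem(M, F, gradF)
o = GradientDescentOptions(M, [1.0, 0.0]; stepsize=ConstantStepsize(M))
get_stepsize(p, o, 1)   # the step size the solver would use at iteration 1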
Manopt.linesearch_backtrack – Method
linesearch_backtrack(M, F, x, gradFx, s, decrease, contract, retr, η = -gradFx, f0 = F(x); stop_step=0.)

perform a linesearch for

  • a manifold M
  • a cost function F,
  • an iterate x
  • the gradient $\operatorname{grad}F(x)$
  • an initial stepsize s usually called $γ$
  • a sufficient decrease parameter
  • a contraction factor $σ$
  • a retraction, which defaults to the ExponentialRetraction()
  • a search direction $η = -\operatorname{grad}F(x)$
  • an offset, $f_0 = F(x)$
  • a keyword stop_step as a minimal step size when to stop
source
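For illustration, a sketch of a single call on the sphere, following the positional argument order above:

using Manopt, Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]
F(x) = distance(M, x, q)^2 / 2
x = [1.0, 0.0, 0.0]
gradFx = -log(M, x, q)
# initial step size 1.0, sufficient decrease 0.1, contraction factor 0.95
s = linesearch_backtrack(M, F, x, gradFx, 1.0, 0.1, 0.95)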
  • Iannazzo2018

    B. Iannazzo, M. Porcelli, The Riemannian Barzilai–Borwein Method with Nonmonotone Line Search and the Matrix Geometric Mean Computation, In: IMA Journal of Numerical Analysis. Volume 38, Issue 1, January 2018, Pages 495–517, doi 10.1093/imanum/drx015

  • Huang2014

Huang, W.: Optimization algorithms on Riemannian manifolds with applications, Dissertation, Florida State University, 2014.