Extensions

LineSearches.jl

Manopt can be used with the line search algorithms implemented in LineSearches.jl. This is illustrated by the following example, which minimizes the Rosenbrock function constrained to the unit sphere.

using Manopt, Manifolds, LineSearches

# define objective function and its gradient
p = [1.0, 100.0]
function rosenbrock(::AbstractManifold, x)
    val = zero(eltype(x))
    for i in 1:(length(x) - 1)
        val += (p[1] - x[i])^2 + p[2] * (x[i + 1] - x[i]^2)^2
    end
    return val
end
function rosenbrock_grad!(M::AbstractManifold, storage, x)
    storage .= 0.0
    for i in 1:(length(x) - 1)
        storage[i] += -2.0 * (p[1] - x[i]) - 4.0 * p[2] * (x[i + 1] - x[i]^2) * x[i]
        storage[i + 1] += 2.0 * p[2] * (x[i + 1] - x[i]^2)
    end
    # project the Euclidean gradient onto the tangent space at x,
    # since Riemannian solvers expect Riemannian gradients
    project!(M, storage, x, storage)
    return storage
end
# define constraint
n_dims = 5
M = Manifolds.Sphere(n_dims)
# set initial point; a point on Sphere(n_dims) has n_dims + 1 components
x0 = vcat(zeros(n_dims), 1.0)
# use the LineSearches.jl HagerZhang line search with the Manopt.jl quasi-Newton solver
ls_hz = Manopt.LineSearchesStepsize(M, LineSearches.HagerZhang())
x_opt = quasi_Newton(
    M,
    rosenbrock,
    rosenbrock_grad!,
    x0;
    stepsize=ls_hz,
    evaluation=InplaceEvaluation(),
    stopping_criterion=StopAfterIteration(1000) | StopWhenGradientNormLess(1e-6),
    return_state=true,
)
# Solver state for `Manopt.jl`s Quasi Newton Method
After 30 iterations

## Parameters
* direction update:        limited memory InverseBFGS (size 5), projections, and ParallelTransport() as vector transport.
* retraction method:       ExponentialRetraction()
* vector transport method: ParallelTransport()

## Stepsize
LineSearchesStepsize(HagerZhang{Float64, Base.RefValue{Bool}}
  delta: Float64 0.1
  sigma: Float64 0.9
  alphamax: Float64 Inf
  rho: Float64 5.0
  epsilon: Float64 1.0e-6
  gamma: Float64 0.66
  linesearchmax: Int64 50
  psi3: Float64 0.1
  display: Int64 0
  mayterminate: Base.RefValue{Bool}
; retraction_method=ExponentialRetraction(), vector_transport_method=ParallelTransport())

## Stopping Criterion
Stop When _one_ of the following are fulfilled:
    Max Iteration 1000:	not reached
    |grad f| < 1.0e-6: reached
Overall: reached
This indicates convergence: Yes
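
Since return_state=true is passed, x_opt above is the full solver state rather than a point. A minimal sketch of extracting the minimizer from that state:

p_opt = get_solver_result(x_opt)  # the resulting point on the sphere
f_opt = rosenbrock(M, p_opt)      # objective value at the computed minimizer
is_point(M, p_opt)                # the iterates should remain on the sphere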

This integration is provided by the following stepsize.

Manopt.LineSearchesStepsize - Type
LineSearchesStepsize <: Stepsize

Wrapper for line searches available in the LineSearches.jl library.

Constructors

LineSearchesStepsize(
    M::AbstractManifold,
    linesearch;
    retraction_method::AbstractRetractionMethod=default_retraction_method(M),
    vector_transport_method::AbstractVectorTransportMethod=default_vector_transport_method(M),
)
LineSearchesStepsize(
    linesearch;
    retraction_method::AbstractRetractionMethod=ExponentialRetraction(),
    vector_transport_method::AbstractVectorTransportMethod=ParallelTransport(),
)

Wrap linesearch (for example HagerZhang or MoreThuente). The initial step selection from LineSearches.jl is not yet supported; the value 1.0 is used instead. The retraction used to determine the line along which the search is performed can be provided as retraction_method. Gradient vectors are transported between points using vector_transport_method.
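
The wrapper can also be combined with other line searches and solvers. The following sketch reuses M, x0, rosenbrock, and rosenbrock_grad! from the example above and passes the MoreThuente line search from LineSearches.jl to the gradient_descent solver.

# sketch: reuse M, x0, rosenbrock and rosenbrock_grad! from the example above
ls_mt = Manopt.LineSearchesStepsize(M, LineSearches.MoreThuente())
x_mt = gradient_descent(
    M,
    rosenbrock,
    rosenbrock_grad!,
    x0;
    stepsize=ls_mt,
    evaluation=InplaceEvaluation(),
    stopping_criterion=StopAfterIteration(1000) | StopWhenGradientNormLess(1e-6),
)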

Manifolds.jl

The following additional methods are available when Manifolds.jl is loaded together with Manopt.jl.

ManifoldsBase.mid_point - Function
mid_point(M, p, q, x)
mid_point!(M, y, p, q, x)

Compute the mid point between p and q. If there is more than one mid point of (not necessarily minimizing) geodesics (for example on the sphere), the one nearest to x is returned. The in-place variant mid_point! stores the result in y.
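
As a minimal sketch of the tie-breaking behavior, consider two antipodal points on the 2-sphere: every point on the great circle orthogonal to p is a mid point of some geodesic, and x selects among them.

using Manopt, Manifolds

M = Sphere(2)
p = [1.0, 0.0, 0.0]
q = [-1.0, 0.0, 0.0]       # antipodal to p, so mid points form a whole great circle
x = [0.0, 1.0, 0.0]        # already lies on that great circle
m = mid_point(M, p, q, x)  # expected to return the mid point nearest to x, here x itself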

Manopt.max_stepsize - Method
max_stepsize(M::TangentBundle, p)

The tangent bundle has an injectivity radius of either infinity (for flat manifolds) or 0 (for non-flat manifolds). This method therefore makes a guess at a reasonable maximum stepsize on a tangent bundle.

Manopt.max_stepsize - Method
max_stepsize(M::FixedRankMatrices, p)

Return a reasonable guess of the maximum step size on FixedRankMatrices, following the choice of typical distance in Matlab Manopt, that is, the dimension of M. See this note.
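
A minimal sketch of calling this method; the matrix sizes and rank below are arbitrary choices for illustration.

using LinearAlgebra, Manifolds, Manopt

M = FixedRankMatrices(10, 8, 3)                      # 10×8 matrices of rank 3
F = svd(randn(10, 8))
p = SVDMPoint(F.U[:, 1:3], F.S[1:3], F.Vt[1:3, :])   # a rank-3 point in SVD form
s = Manopt.max_stepsize(M, p)                        # a guess based on the dimension of M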
