Levenberg-Marquardt

Manopt.LevenbergMarquardt — Function
LevenbergMarquardt(M, f, jacobian_f, p, num_components=-1)

Solve an optimization problem of the form

\[\operatorname{arg\,min}_{p ∈ \mathcal M} \frac{1}{2} \lVert f(p) \rVert^2,\]

where $f: \mathcal M → ℝ^d$ is a continuously differentiable function, using the Riemannian Levenberg-Marquardt algorithm [Pee93]. The implementation follows Algorithm 1 of [AOT22].

Input

  • M: a manifold $\mathcal M$
  • f: a cost function $f: \mathcal M→ℝ^d$
  • jacobian_f: the Jacobian of $f$. The Jacobian is supposed to accept a keyword argument basis_domain which specifies the basis of the tangent space at a given point in which the Jacobian is to be calculated. By default it should be the DefaultOrthonormalBasis.
  • p: an initial value $p ∈ \mathcal M$
  • num_components: length of the vector returned by the cost function ($d$). By default its value is -1, which means that it is determined automatically by calling f one additional time. This is only possible when evaluation is AllocatingEvaluation; for mutating (in-place) evaluation this value must be specified explicitly.

These can also be passed as a NonlinearLeastSquaresObjective, in which case the keyword jacobian_tangent_basis below is ignored.
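As a minimal usage sketch (the data, the cost f, and the Jacobian below are illustrative assumptions, not part of the API), one might compute the Riemannian mean of a few points on the sphere as a nonlinear least squares problem, since the mean minimizes $\frac{1}{2}\sum_i \operatorname{distance}(\mathcal M, p, q_i)^2$; this assumes Manifolds.jl provides the Sphere:

using Manopt, Manifolds

M = Sphere(2)
data = [rand(M) for _ in 1:10]            # illustrative data points

# residual vector f(p) ∈ ℝ^d with d = length(data)
f(M, p) = [distance(M, p, q) for q in data]

# Jacobian of f: row i holds the coordinates of the Riemannian gradient of
# distance(M, ⋅, qᵢ) at p, expressed in the tangent space basis `basis_domain`
function jacobian_f(M, p; basis_domain::AbstractBasis=DefaultOrthonormalBasis())
    rows = map(data) do q
        grad_i = -log(M, p, q) / max(distance(M, p, q), eps())
        return get_coordinates(M, p, grad_i, basis_domain)'
    end
    return reduce(vcat, rows)             # d × manifold_dimension(M) matrix
end

p0 = rand(M)
p_star = LevenbergMarquardt(M, f, jacobian_f, p0, length(data))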

Optional

  • evaluation: (AllocatingEvaluation) specify whether the gradient works by allocation (default), of the form gradF(M, x), or in place (InplaceEvaluation), of the form gradF!(M, X, x).
  • retraction_method: (default_retraction_method(M, typeof(p))) a retraction(M,x,ξ) to use.
  • stopping_criterion: (StopAfterIteration(200) | StopWhenGradientNormLess(1e-12)) a functor inheriting from StoppingCriterion indicating when to stop.
  • expect_zero_residual: (false) whether or not the algorithm might expect that the value of the residual (objective) at the minimum is equal to 0.
  • η: scaling factor for the sufficient cost decrease threshold required to accept new proposal points. Allowed range: 0 < η < 1.
  • damping_term_min: initial (and also minimal) value of the damping term
  • β: parameter by which the damping term is multiplied when the current new point is rejected
  • initial_residual_values: the initial residual vector of the cost function f.
  • initial_jacobian_f: the initial Jacobian of the cost function f.
  • jacobian_tangent_basis: an AbstractBasis specifying the basis of the tangent space for jacobian_f.

All other keyword arguments are passed to decorate_state! for decorators or decorate_objective!, respectively. If you provide the NonlinearLeastSquaresObjective directly, these decorations can still be specified.
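Continuing the sketch above, some of these keywords might be passed as follows; the values are illustrative, not recommendations:

q = LevenbergMarquardt(
    M, f, jacobian_f, p0, length(data);
    stopping_criterion=StopAfterIteration(100) | StopWhenGradientNormLess(1e-10),
    expect_zero_residual=false,   # the residual at the mean is in general nonzero
    damping_term_min=0.1,
)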

Output

the obtained (approximate) minimizer $p^*$; see get_solver_return for details


Options

Manopt.LevenbergMarquardtState — Type
LevenbergMarquardtState{P,T} <: AbstractGradientSolverState

Describes the state of the Levenberg-Marquardt solver, with

Fields

A default value is given in brackets if a parameter can be left out in initialization.

  • x: a point (of type P) on a manifold as starting point
  • stop: (StopAfterIteration(200) | StopWhenGradientNormLess(1e-12) | StopWhenStepsizeLess(1e-12)) a StoppingCriterion
  • retraction_method: (default_retraction_method(M, typeof(p))) the retraction to use, defaults to the default set for your manifold.
  • residual_values: value of $F$ calculated in the solver setup or the previous iteration
  • residual_values_temp: value of $F$ for the current proposal point
  • jacF: the current Jacobian of $F$
  • gradient: the current gradient of $F$
  • step_vector: the tangent vector at x that is used to move to the next point
  • last_stepsize: the length of step_vector
  • η: scaling factor for the sufficient cost decrease threshold required to accept new proposal points. Allowed range: 0 < η < 1.
  • damping_term: current value of the damping term
  • damping_term_min: initial (and also minimal) value of the damping term
  • β: parameter by which the damping term is multiplied when the current new point is rejected
  • expect_zero_residual: (false) if true, the algorithm expects that the value of the residual (objective) at minimum is equal to 0.

Constructor

LevenbergMarquardtState(M, initialX, initial_residual_values, initial_jacF; initial_vector, kwargs...)

Generate Levenberg-Marquardt options.
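As a rough sketch of calling this constructor directly, assuming M is a manifold, p0 a point on it, and d the (illustrative) number of residual components; the zero initializations below are placeholders, not required values:

d = 10
st = LevenbergMarquardtState(
    M,
    p0,
    zeros(d),                         # initial residual values
    zeros(d, manifold_dimension(M));  # initial Jacobian
    initial_vector=zero_vector(M, p0),
)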

See also

gradient_descent, LevenbergMarquardt


Technical details

The LevenbergMarquardt solver requires the following functions of a manifold to be available

  • A retract!(M, q, p, X); it is recommended to set the default_retraction_method to a favourite retraction. If this default is set, a retraction_method= does not have to be specified.
  • The norm of tangent vectors, to stop when the norm of the gradient is small; if you implemented inner, the norm is provided already.
  • A copyto!(M, q, p) and copy(M, p) for points.
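These requirements can be checked interactively; the following sketch reuses M and p0 from the usage example above:

using LinearAlgebra: norm

default_retraction_method(M, typeof(p0))  # retraction used unless retraction_method= is passed
norm(M, p0, zero_vector(M, p0))           # tangent vector norm, used by the stopping criterion
copy(M, p0)                               # point copies are used internally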

Literature

[AOT22]
S. Adachi, T. Okuno and A. Takeda. Riemannian Levenberg-Marquardt Method with Global and Local Convergence Properties. ArXiv Preprint (2022).
[Pee93]
R. Peeters. On a Riemannian version of the Levenberg-Marquardt algorithm. Serie Research Memoranda 0011 (VU University Amsterdam, Faculty of Economics, Business Administration and Econometrics, 1993).