# Levenberg-Marquardt

`Manopt.LevenbergMarquardt` — Function

```julia
LevenbergMarquardt(M, f, jacobian_f, p, num_components=-1)
LevenbergMarquardt!(M, f, jacobian_f, p, num_components=-1; kwargs...)
```

Solve an optimization problem of the form

$$\operatorname*{arg\,min}_{p ∈ \mathcal M} \frac{1}{2} \lVert f(p) \rVert^2,$$

where $f: \mathcal M → ℝ^d$ is a continuously differentiable function, using the Riemannian Levenberg-Marquardt algorithm [Pee93]. The implementation follows Algorithm 1 of [AOT22]. The second signature performs the optimization in place of p.
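
A minimal usage sketch, assuming Manopt.jl and Manifolds.jl are available. The residual function and Jacobian below (`f`, `jacobian_f`, the target `b`) are illustrative choices, not part of the API: on the Euclidean manifold the problem reduces to an ordinary linear least-squares fit whose minimizer is `b` itself.

```julia
using Manopt, Manifolds, LinearAlgebra

# Illustrative problem: minimize ½‖p - b‖² over ℝ³, i.e. residual f(p) = p - b.
M = Euclidean(3)
b = [1.0, 2.0, 3.0]

f(M, p) = p - b                # residual vector in ℝ^d (here d = 3)
# Jacobian with respect to an (orthonormal) basis of the tangent space;
# for this linear residual it is simply the identity matrix.
jacobian_f(M, p; basis_domain=DefaultOrthonormalBasis()) = Matrix{Float64}(I, 3, 3)

p0 = zeros(3)
p_star = LevenbergMarquardt(M, f, jacobian_f, p0, 3)
```

Since the residual is linear and vanishes at `b`, the solver should return a point (numerically) equal to `b`.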

Input

• M::AbstractManifold: a Riemannian manifold $\mathcal M$
• f: a cost function $f: \mathcal M→ℝ^d$
• jacobian_f: the Jacobian of $f$. The Jacobian is supposed to accept a keyword argument basis_domain, which specifies the basis of the tangent space at a given point in which the Jacobian is to be calculated. By default it should be the DefaultOrthonormalBasis.
• p: a point on the manifold $\mathcal M$
• num_components: length of the vector returned by the cost function ($d$). By default its value is -1, which means that it is determined automatically by calling f one additional time. This is only possible when evaluation is AllocatingEvaluation; for mutating evaluation this value must be specified explicitly.

These can also be passed as a NonlinearLeastSquaresObjective, in which case the keyword jacobian_tangent_basis below is ignored.

Keyword arguments

• evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating their result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
• η=0.2: scaling factor for the sufficient cost decrease threshold required to accept new proposal points. Allowed range: 0 < η < 1.
• expect_zero_residual=false: whether the algorithm expects the value of the residual (objective) at the minimum to be equal to 0.
• damping_term_min=0.1: initial (and also minimal) value of the damping term
• β=5.0: parameter by which the damping term is multiplied when the current new point is rejected
• initial_jacobian_f: the initial Jacobian of the cost function f. By default this is a matrix of size num_components × manifold dimension, of a type similar to p.
• initial_residual_values: the initial residual vector of the cost function f. By default this is a vector of length num_components of similar type as p.
• jacobian_tangent_basis: an AbstractBasis specifying the basis of the tangent space for jacobian_f.
• retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
• stopping_criterion=StopAfterIteration(200)|StopWhenGradientNormLess(1e-12): a functor indicating that the stopping criterion is fulfilled

All other keyword arguments are passed to decorate_state! for state decorators or decorate_objective! for objective decorators, respectively.
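
To illustrate how η, β, and damping_term_min interact, here is a pure-Julia sketch of a single Euclidean Levenberg-Marquardt step. This is an illustration of the general scheme, not the Manopt.jl internals; the helper `lm_step` and the quadratic test residual are hypothetical.

```julia
using LinearAlgebra

# One Euclidean LM step: solve the damped normal equations, then accept the
# proposal only if the actual cost decrease is at least η times the decrease
# predicted by the linear model; otherwise increase the damping term by β.
function lm_step(f, J, p, μ; η=0.2, β=5.0)
    r = f(p)
    Jp = J(p)
    d = (Jp' * Jp + μ * I) \ (-Jp' * r)            # damped Gauss-Newton step
    pred = 0.5 * (norm(r)^2 - norm(r + Jp * d)^2)  # model-predicted decrease
    act = 0.5 * (norm(r)^2 - norm(f(p + d))^2)     # actual decrease of ½‖f‖²
    if act ≥ η * pred
        return p + d, μ / β    # accept the proposal, relax the damping
    else
        return p, μ * β        # reject it, keep p, increase the damping
    end
end

# Linear residual f(p) = p .- 1: the model is exact, so the step is accepted.
f(p) = p .- 1.0
J(p) = Matrix{Float64}(I, 2, 2)
p, μ = lm_step(f, J, zeros(2), 0.1)
```

For this linear residual the predicted and actual decreases coincide, so the proposal is accepted and the damping term shrinks from 0.1 to 0.02.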

Output

The obtained approximate minimizer $p^*$. To obtain the whole final state of the solver, see get_solver_return for details, especially the return_state= keyword.

`Manopt.LevenbergMarquardt!` — Function

The in-place variant shares the signatures and documentation of LevenbergMarquardt above; `LevenbergMarquardt!(M, f, jacobian_f, p, num_components=-1; kwargs...)` performs the optimization in place of p.

## Options

`Manopt.LevenbergMarquardtState` — Type

```julia
LevenbergMarquardtState{P,T} <: AbstractGradientSolverState
```

Describes the state of a gradient-based Levenberg-Marquardt algorithm.

Fields

A default value is given in brackets if a parameter can be left out in initialization.

• p::P: a point on the manifold $\mathcal M$ storing the current iterate
• retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
• residual_values: value of $F$ calculated in the solver setup or the previous iteration
• residual_values_temp: value of $F$ for the current proposal point
• stop::StoppingCriterion: a functor indicating that the stopping criterion is fulfilled
• jacF: the current Jacobian of $F$
• gradient: the current gradient of $F$
• step_vector: the tangent vector at p that is used to move to the next point
• last_stepsize: length of step_vector
• η: Scaling factor for the sufficient cost decrease threshold required to accept new proposal points. Allowed range: 0 < η < 1.
• damping_term: current value of the damping term
• damping_term_min: initial (and also minimal) value of the damping term
• β: parameter by which the damping term is multiplied when the current new point is rejected
• expect_zero_residual: if true, the algorithm expects that the value of the residual (objective) at minimum is equal to 0.
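
How these fields relate can be sketched in Euclidean coordinates (an illustration with made-up values, not the Manopt.jl internals): the gradient of $\frac{1}{2}\lVert F \rVert^2$ is $J^{\mathrm T} F$, the step vector solves the damped normal equations, and last_stepsize is its length.

```julia
using LinearAlgebra

residual_values = [1.0, -2.0, 0.5]      # F(p), length num_components
jacF = [1.0 0.0; 0.0 1.0; 1.0 1.0]      # Jacobian of F at p in basis coordinates

gradient = jacF' * residual_values       # ∇(½‖F‖²) = Jᵀ F
step_vector = (jacF' * jacF + 0.1 * I) \ (-gradient)  # damped Gauss-Newton step
last_stepsize = norm(step_vector)        # length of the step just taken
```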

Constructor

```julia
LevenbergMarquardtState(M, initial_residual_values, initial_jacF; kwargs...)
```

Generate the Levenberg-Marquardt solver state.

Keyword arguments

The following fields are keyword arguments

The LevenbergMarquardt solver requires the following functions of a manifold to be available