# A Manifold Objective

The objective describes the actual cost function and all its properties.

Manopt.AbstractManifoldObjectiveType
AbstractManifoldObjective{T<:AbstractEvaluationType}

Describe the collection of the optimization function $f\colon \mathcal M → ℝ$ (or even a vectorial range) and its corresponding elements, which might for example be a gradient or (one or more) proximal maps.

All these elements should usually be implemented as functions (M, p) -> ... or (M, X, p) -> ..., that is

• the first argument of these functions should be the manifold M they are defined on,
• the argument X is present if the computation is performed in place of X (see InplaceEvaluation),
• the argument p is the place the function ($f$ or one of its elements) is evaluated at.

The type T indicates the global AbstractEvaluationType.

source
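To illustrate these conventions, here is a minimal sketch (assuming Manifolds.jl is loaded; the point q and the distance-based cost are only illustrative choices) of a cost together with an allocating and an in-place gradient:

```julia
using Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]

# cost: the manifold M is the first argument, the point p the evaluation place
f(M, p) = distance(M, p, q)^2 / 2

# allocating gradient: (M, p) -> X
grad_f(M, p) = -log(M, p, q)

# in-place gradient: (M, X, p) -> X, computing in place of X
function grad_f!(M, X, p)
    log!(M, X, p, q)
    X .*= -1
    return X
end
```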
Manopt.decorate_objective!Function
decorate_objective!(M, o::AbstractManifoldObjective)

Decorate the AbstractManifoldObjective o with specific decorators.

Optional Arguments

Optional arguments provide necessary details on the decorators. A specific one is used to activate certain decorators.

• cache – (missing) currently only supports the SimpleCacheObjective which is activated by either specifying the symbol :Simple or the tuple (:Simple, kwargs...) to pass down keyword arguments

Other keywords are ignored.

objective_cache_factory

source
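For example, a gradient objective could be wrapped in the simple cache via the cache keyword. The following sketch assumes Manopt.jl and Manifolds.jl are loaded; the cost, gradient, and manifold are only illustrative choices:

```julia
using Manopt, Manifolds

M = Euclidean(2)
f(M, p) = sum(p .^ 2)
grad_f(M, p) = 2 .* p
obj = ManifoldGradientObjective(f, grad_f)

# activate the SimpleCacheObjective decorator via the cache keyword
cached_obj = decorate_objective!(M, obj; cache=:Simple)
```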

An objective has two main possibilities for its contained functions concerning the evaluation mode – this does not necessarily concern the cost, but for example the gradient in an AbstractManifoldGradientObjective.


## Cost Objective

Manopt.ManifoldCostObjectiveType
ManifoldCostObjective{T, TC} <: AbstractManifoldCostObjective{T, TC}

Specify an AbstractManifoldObjective that only has information about the cost function $f\colon \mathcal M → ℝ$, implemented as a function (M, p) -> c to compute the cost value c at p on the manifold M.

• cost – a function $f: \mathcal M → ℝ$ to minimize

Constructors

ManifoldCostObjective(f)

Generate a problem. While this Problem does not have any allocating functions, the type T can be set for consistency reasons with other problems.

Used with

source
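As a small sketch (the concrete cost and manifold are only illustrative, and Manifolds.jl is assumed to be loaded):

```julia
using Manopt, Manifolds

M = Euclidean(3)
f(M, p) = sum(abs2, p)      # cost (M, p) -> c

mco = ManifoldCostObjective(f)
get_cost(M, mco, [1.0, 2.0, 2.0])   # 9.0
```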

### Access functions

Manopt.get_costFunction
get_cost(amp::AbstractManoptProblem, p)

evaluate the cost function f stored within the AbstractManifoldObjective of an AbstractManoptProblem amp at the point p.

source
get_cost(M::AbstractManifold, obj::AbstractManifoldObjective, p)

evaluate the cost function f defined on M stored within the AbstractManifoldObjective at the point p.

source
get_cost(M::AbstractManifold, mco::AbstractManifoldCostObjective, p)

Evaluate the cost function from within the AbstractManifoldCostObjective on M at p.

By default this implementation assumes that the cost is stored within mco.cost.

source
Manopt.get_cost_functionFunction
get_cost_function(amco::AbstractManifoldCostObjective)

return the function to evaluate (just) the cost $f(p)=c$ as a function (M,p) -> c.

source

Manopt.ManifoldGradientObjectiveType
ManifoldGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}

specify an objective containing a cost and its gradient.

Fields

• cost – a function $f\colon\mathcal M → ℝ$
• gradient!! – the gradient $\operatorname{grad}f\colon\mathcal M → \mathcal T\mathcal M$ of the cost function $f$.

Depending on the AbstractEvaluationType T, the gradient can have two forms.

Constructors

ManifoldGradientObjective(cost, gradient; evaluation=AllocatingEvaluation())

Used with

source
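The two evaluation modes can be sketched as follows (assuming Manifolds.jl is loaded; cost and gradient are illustrative choices):

```julia
using Manopt, Manifolds

M = Euclidean(2)
f(M, p) = sum(p .^ 2)

# allocating variant: (M, p) -> X
grad_f(M, p) = 2 .* p
mgo = ManifoldGradientObjective(f, grad_f)

# in-place variant: (M, X, p) -> X
grad_f!(M, X, p) = (X .= 2 .* p)
mgo! = ManifoldGradientObjective(f, grad_f!; evaluation=InplaceEvaluation())

p = [1.0, 2.0]
get_gradient(M, mgo, p)          # [2.0, 4.0]
X = zero_vector(M, p)
get_gradient!(M, X, mgo!, p)     # X now holds [2.0, 4.0]
```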
Manopt.ManifoldAlternatingGradientObjectiveType
ManifoldAlternatingGradientObjective{E<:AbstractEvaluationType,TCost,TGradient} <: AbstractManifoldGradientObjective{E}

An alternating gradient objective consists of

• a cost function $F(x)$
• a gradient $\operatorname{grad}F$ that is either
  • given as one function $\operatorname{grad}F$ returning a tangent vector X on M, or
  • an array of gradient functions $\operatorname{grad}F_i$, $i=1,…,n$, each returning a component of the gradient,
which might be allocating or mutating variants, but not a mix of both.
Note

This objective is usually defined using the ProductManifold from Manifolds.jl, so Manifolds.jl needs to be loaded.

Constructors

ManifoldAlternatingGradientObjective(F, gradF::Function;
    evaluation=AllocatingEvaluation()
)
ManifoldAlternatingGradientObjective(F, gradF::AbstractVector{<:Function};
    evaluation=AllocatingEvaluation()
)

Create an alternating gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.

source
Manopt.ManifoldStochasticGradientObjectiveType
ManifoldStochasticGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}

A stochastic gradient objective consists of

• a(n optional) cost function $f(p) = \displaystyle\sum_{i=1}^n f_i(p)$
• an array of gradients, $\operatorname{grad}f_i(p), i=1,\ldots,n$ which can be given in two forms
• as one single function $(\mathcal M, p) ↦ (X_1,…,X_n) \in (T_p\mathcal M)^n$
• as a vector of functions $\bigl( (\mathcal M, p) ↦ X_1, …, (\mathcal M, p) ↦ X_n\bigr)$.

Both variants can also be provided as InplaceEvaluation functions, i.e. (M, X, p) -> X, where X is the vector of X1,...,Xn, and (M, X1, p) -> X1, ..., (M, Xn, p) -> Xn, respectively.

Constructors

ManifoldStochasticGradientObjective(
    grad_f::Function;
    cost=Missing(),
    evaluation=AllocatingEvaluation()
)
ManifoldStochasticGradientObjective(
    grad_f::AbstractVector{<:Function};
    cost=Missing(), evaluation=AllocatingEvaluation()
)

Create a stochastic gradient problem with an optional cost and the gradient either as one function (returning an array of tangent vectors) or a vector of functions (each returning one tangent vector).

Used with

stochastic_gradient_descent

Note that this can also be used with a gradient_descent, since the (complete) gradient is just the sum of the single gradients.

source
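A minimal sketch with a vector of component gradients (assuming Manifolds.jl is loaded; the summands $f_i(p)=\lVert p-q_i\rVert^2/2$ are only illustrative):

```julia
using Manopt, Manifolds

M = Euclidean(2)
qs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]

# component gradients grad f_i for f(p) = Σ_i ‖p - q_i‖²/2
grad_fs = [(M, p) -> p .- q for q in qs]
f(M, p) = sum(sum(abs2, p .- q) / 2 for q in qs)

sgo = ManifoldStochasticGradientObjective(grad_fs; cost=f)

p = [0.0, 0.0]
get_gradient(M, sgo, p, 2)   # second summand: p - q₂ = [0.0, -1.0]
get_gradients(M, sgo, p)     # all three component gradients
get_gradient(M, sgo, p)      # full gradient: sum of the components
```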
Manopt.NonlinearLeastSquaresObjectiveType
NonlinearLeastSquaresObjective{T<:AbstractEvaluationType} <: AbstractManifoldObjective{T}

A type for nonlinear least squares problems. T is an AbstractEvaluationType for the F and Jacobian functions.

Specify a nonlinear least squares problem

Fields

• F – a function $F: \mathcal M → ℝ^d$ to minimize
• jacF!! – Jacobian of the function $F$
• jacB – the basis of tangent space used for computing the Jacobian.
• num_components – number of values returned by F (equal to d).

Depending on the AbstractEvaluationType T, the function $F$ has to be provided:

• as a function (M::AbstractManifold, p) -> v that allocates memory for v itself for an AllocatingEvaluation,
• as a function (M::AbstractManifold, v, p) -> v that works in place of v for an InplaceEvaluation.

The Jacobian jacF!! is required in the same fashion:

• as a function (M::AbstractManifold, p; basis_domain::AbstractBasis) -> v that allocates memory for v itself for an AllocatingEvaluation,
• as a function (M::AbstractManifold, v, p; basis_domain::AbstractBasis) -> v that works in place of v for an InplaceEvaluation.

Constructors

NonlinearLeastSquaresObjective(M, F, jacF, num_components; evaluation=AllocatingEvaluation(), jacB=DefaultOrthonormalBasis())

source

There is also a second variant, for the case that just one function computes both the cost and the gradient:

Manopt.ManifoldCostGradientObjectiveType
ManifoldCostGradientObjective{T} <: AbstractManifoldObjective{T}

specify an objective containing one function to perform a combined computation of cost and its gradient.

Fields

• costgrad!! – a function that computes both the cost $f\colon\mathcal M → ℝ$ and its gradient $\operatorname{grad}f\colon\mathcal M → \mathcal T\mathcal M$

Depending on the AbstractEvaluationType T, the gradient can have two forms.

Constructors

ManifoldCostGradientObjective(costgrad; evaluation=AllocatingEvaluation())

Used with

source
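A sketch of the combined variant, assuming the combined function returns the tuple of cost value and gradient (the concrete cost is only illustrative):

```julia
using Manopt, Manifolds

M = Euclidean(2)

# one function returning the tuple (cost, gradient)
costgrad(M, p) = (sum(p .^ 2), 2 .* p)

cgo = ManifoldCostGradientObjective(costgrad)

p = [1.0, 2.0]
get_cost(M, cgo, p)       # 5.0
get_gradient(M, cgo, p)   # [2.0, 4.0]
```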

### Access functions

Manopt.get_gradientFunction
get_gradient(s::AbstractManoptSolverState)

return the (last stored) gradient within the AbstractManoptSolverState s. By default the state is undecorated beforehand.

source
get_gradient(amp::AbstractManoptProblem, p)
get_gradient!(amp::AbstractManoptProblem, X, p)

evaluate the gradient of an AbstractManoptProblem amp at the point p.

The evaluation is done in place of X for the !-variant.

source
get_gradient(M::AbstractManifold, mgo::AbstractManifoldGradientObjective{T}, p)
get_gradient!(M::AbstractManifold, X, mgo::AbstractManifoldGradientObjective{T}, p)

evaluate the gradient of an AbstractManifoldGradientObjective{T} mgo at p.

The evaluation is done in place of X for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=InplaceEvaluation memory for the result is allocated.

Note that the order of parameters follows the philosophy of Manifolds.jl, namely that even for the mutating variant, the manifold is the first parameter and the (in-place) tangent vector X comes second.

source
get_gradient(agst::AbstractGradientSolverState)

return the gradient stored within gradient options. The default returns agst.X.

source
get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p, k)
get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, Y, p, k)

Evaluate one of the summands' gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of Y).

If you use a single function for the stochastic gradient that works in place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) cannot be determined.

source
get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, X, p)

Evaluate the complete gradient $\operatorname{grad} f = \displaystyle\sum_{i=1}^n \operatorname{grad} f_i(p)$ at p (in place of X).

If you use a single function for the stochastic gradient that works in place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) cannot be determined.

source
X = get_gradient(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)
get_gradient!(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, X, p)

Evaluate all summands' gradients at a point p on the ProductManifold M (in place of X).

source
X = get_gradient(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, p, k)
get_gradient!(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, X, p, k)

Evaluate one of the component gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of X).

source
Manopt.get_gradientsFunction
get_gradients(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
get_gradients!(M::AbstractManifold, X, sgo::ManifoldStochasticGradientObjective, p)

Evaluate all summands' gradients $\{\operatorname{grad}f_i\}_{i=1}^n$ at p (in place of X).

If you use a single function for the stochastic gradient that works in place, then get_gradient is not available, since the length (or number of elements of the gradient) cannot be determined.

source
Manopt.get_gradient_functionFunction
get_gradient_function(amgo::AbstractManifoldGradientObjective{E<:AbstractEvaluationType})

return the function to evaluate (just) the gradient $\operatorname{grad} f(p)$. Depending on the AbstractEvaluationType E this is a function

• (M, p) -> X for the AllocatingEvaluation case,
• (M, X, p) -> X for the InplaceEvaluation case.

source

Manopt.ManifoldSubgradientObjectiveType
ManifoldSubgradientObjective{T<:AbstractEvaluationType,C,S} <:AbstractManifoldCostObjective{T, C}

A structure to store information about an objective for a subgradient based optimization problem.

Fields

• cost – the function $F$ to be minimized
• subgradient – a function returning a subgradient $\partial F$ of $F$

Constructor

ManifoldSubgradientObjective(f, ∂f)

Generate the ManifoldSubgradientObjective for a subgradient objective, i.e. a (cost) function f(M, p) and a function ∂f(M, p) that returns a not necessarily deterministic element from the subdifferential at p on a manifold M.

source
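A minimal sketch for the norm as a nonsmooth cost (assuming Manifolds.jl is loaded; returning the zero vector at the origin is one valid, if arbitrary, subgradient choice):

```julia
using Manopt, Manifolds, LinearAlgebra

M = Euclidean(2)
f(M, p) = norm(p)

# one element of the subdifferential of the norm;
# at p = 0 the zero vector lies in the subdifferential
∂f(M, p) = iszero(p) ? zero(p) : p ./ norm(p)

sgo = ManifoldSubgradientObjective(f, ∂f)
get_subgradient(M, sgo, [3.0, 4.0])   # [0.6, 0.8]
```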

### Access Functions

Manopt.get_subgradientFunction
get_subgradient(amp::AbstractManoptProblem, p)
get_subgradient!(amp::AbstractManoptProblem, X, p)

evaluate the subgradient of an AbstractManoptProblem amp at point p.

The evaluation is done in place of X for the !-variant. The result might not be deterministic, one element of the subdifferential is returned.

source
X = get_subgradient(M::AbstractManifold, sgo::ManifoldSubgradientObjective, p)
get_subgradient!(M::AbstractManifold, X, sgo::ManifoldSubgradientObjective, p)

Evaluate the (sub)gradient of a ManifoldSubgradientObjective sgo at the point p.

The evaluation is done in place of X for the !-variant. The result might not be deterministic, one element of the subdifferential is returned.

source

## Proximal Map Objective

Manopt.ManifoldProximalMapObjectiveType
ManifoldProximalMapObjective{E<:AbstractEvaluationType, TC, TP, V <: Vector{<:Integer}} <: AbstractManifoldCostObjective{E, TC}

specify a problem for solvers based on the evaluation of proximal map(s).

Fields

• cost - a function $F:\mathcal M→ℝ$ to minimize
• proxes - proximal maps $\operatorname{prox}_{λ\varphi}:\mathcal M→\mathcal M$ as functions (M, λ, p) -> q.
• number_of_proxes - (ones(length(proxes))) number of proximal maps per function, e.g. if one of the maps is a combined one, such that the proximal map functions return more than one entry per function, you have to adapt this value. If not specified, it is set to one prox per function.

source

### Access Functions

Manopt.get_proximal_mapFunction
q = get_proximal_map(M::AbstractManifold, mpo::ManifoldProximalMapObjective, λ, p, i)
get_proximal_map!(M::AbstractManifold, q, mpo::ManifoldProximalMapObjective, λ, p, i)

evaluate the ith proximal map of the ManifoldProximalMapObjective mpo at the point p on M with parameter $λ>0$.

source

## Hessian Objective

Manopt.ManifoldHessianObjectiveType
ManifoldHessianObjective{T<:AbstractEvaluationType,C,G,H,Pre} <: AbstractManifoldGradientObjective{T}

specify a problem for Hessian based algorithms.

Fields

• cost : a function $F:\mathcal M→ℝ$ to minimize
• gradient : the gradient $\operatorname{grad}F:\mathcal M → \mathcal T\mathcal M$ of the cost function $F$
• hessian : the hessian $\operatorname{Hess}F(x)[⋅]: \mathcal T_{x} \mathcal M → \mathcal T_{x} \mathcal M$ of the cost function $F$
• preconditioner : the symmetric, positive definite preconditioner as an approximation of the inverse of the Hessian of $f$, i.e. as a map with the same input variables as the hessian.

Depending on the AbstractEvaluationType T, the gradient and Hessian can have two forms.

Constructor

ManifoldHessianObjective(f, grad_f, Hess_f, preconditioner = (M, p, X) -> X;
evaluation=AllocatingEvaluation())

source
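A sketch for a quadratic cost (assuming Manifolds.jl is loaded and the objective-based get_hessian access variant, analogous to the other access functions; matrix and cost are illustrative):

```julia
using Manopt, Manifolds

M = Euclidean(2)
A = [2.0 0.0; 0.0 4.0]

f(M, p) = p' * A * p / 2
grad_f(M, p) = A * p
Hess_f(M, p, X) = A * X       # the Hessian as a map on tangent vectors

# default preconditioner is the identity (M, p, X) -> X
mho = ManifoldHessianObjective(f, grad_f, Hess_f)

p = [1.0, 1.0]
get_gradient(M, mho, p)            # [2.0, 4.0]
get_hessian(M, mho, p, [1.0, 0.0]) # [2.0, 0.0]
```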

### Access functions

Manopt.get_hessianFunction
Y = get_hessian(amp::AbstractManoptProblem{T}, p, X)
get_hessian!(amp::AbstractManoptProblem{T}, Y, p, X)

evaluate the Hessian of an AbstractManoptProblem amp at p applied to a tangent vector X, i.e. compute $\operatorname{Hess}f(p)[X]$, which can also happen in place of Y.

source
Manopt.get_preconditionerFunction
get_preconditioner(amp::AbstractManoptProblem, p, X)

evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function f) of an AbstractManoptProblem amp's objective at the point p applied to a tangent vector X.

source
get_preconditioner(M::AbstractManifold, mho::ManifoldHessianObjective, p, X)

evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function F) of a ManifoldHessianObjective mho at the point p applied to a tangent vector X.

source

## Primal-Dual based Objectives

Manopt.PrimalDualManifoldObjectiveType
PrimalDualManifoldObjective{E<:AbstractEvaluationType} <: AbstractPrimalDualManifoldObjective{E}

Describes an objective for the linearized or exact Chambolle–Pock algorithm.[BergmannHerzogSilvaLouzeiroTenbrinckVidalNunez2020][ChambollePock2011]

Fields

All fields with !! can either be mutating or nonmutating functions, which should be set depending on the parameter T <: AbstractEvaluationType.

• cost $F + G(Λ(⋅))$ to evaluate interim cost function values
• linearized_forward_operator!! linearized operator for the forward operation in the algorithm $DΛ$
• linearized_adjoint_operator!! The adjoint differential $(DΛ)^* : \mathcal N → T\mathcal M$
• prox_f!! the proximal map belonging to $f$
• prox_G_dual!! the proximal map belonging to $g_n^*$
• Λ!! – (forward_operator) the forward operator (if given) $Λ: \mathcal M → \mathcal N$

Usually, either the linearized operator $DΛ$ or $Λ$ is required.

Constructor

PrimalDualManifoldObjective(cost, prox_f, prox_G_dual, adjoint_linearized_operator;
linearized_forward_operator::Union{Function,Missing}=missing,
Λ::Union{Function,Missing}=missing,
evaluation::AbstractEvaluationType=AllocatingEvaluation()
)

The last optional argument can be used to provide the 4 or 5 functions as allocating or mutating (in place computation) ones. Note that the first argument is always the manifold under consideration, the mutated one is the second.

source
Manopt.PrimalDualManifoldSemismoothNewtonObjectiveType
PrimalDualManifoldSemismoothNewtonObjective{E<:AbstractEvaluationType, TC, LO, ALO, PF, DPF, PG, DPG, L} <: AbstractPrimalDualManifoldObjective{E, TC, PF}

Describes a Problem for the Primal-dual Riemannian semismooth Newton algorithm. [DiepeveenLellmann2021]

Fields

• cost $F + G(Λ(⋅))$ to evaluate interim cost function values
• linearized_operator the linearization $DΛ(⋅)[⋅]$ of the operator $Λ(⋅)$.
• linearized_adjoint_operator The adjoint differential $(DΛ)^* \colon \mathcal N \to T\mathcal M$
• prox_F the proximal map belonging to $f$
• diff_prox_F the (Clarke Generalized) differential of the proximal maps of $F$
• prox_G_dual the proximal map belonging to $g_n^*$
• diff_prox_dual_G the (Clarke Generalized) differential of the proximal maps of $G^\ast_n$
• Λ – the exact forward operator. This operator is required if Λ(m)=n does not hold.

Constructor

PrimalDualManifoldSemismoothNewtonObjective(cost, prox_F, prox_G_dual, forward_operator, adjoint_linearized_operator,Λ)
source

### Access functions

Manopt.adjoint_linearized_operatorFunction
X = adjoint_linearized_operator(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)
adjoint_linearized_operator!(N::AbstractManifold, X, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)

Evaluate the adjoint of the linearized forward operator $(DΛ(m))^*[Y]$ stored within the AbstractPrimalDualManifoldObjective (in place of X). Since $Y∈T_n\mathcal N$, both $m$ and $n=Λ(m)$ are necessary arguments, mainly because the forward operator $Λ$ might be missing in the objective.

source
Manopt.forward_operatorFunction
q = forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, p)
forward_operator!(M::AbstractManifold, N::AbstractManifold, q, apdmo::AbstractPrimalDualManifoldObjective, p)

Evaluate the forward operator $Λ(p)$ stored within the AbstractPrimalDualManifoldObjective (in place of q).

source
Manopt.get_differential_dual_proxFunction
η = get_differential_dual_prox(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, n, τ, X, ξ)
get_differential_dual_prox!(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, η, n, τ, X, ξ)

Evaluate the differential proximal map of $G_n^*$ stored within PrimalDualManifoldSemismoothNewtonObjective

$$D\operatorname{prox}_{τG_n^*}(X)[ξ]$$

which can also be computed in place of η.

source
Manopt.get_differential_primal_proxFunction
y = get_differential_primal_prox(M::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, σ, x)
get_differential_primal_prox!(M::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, y, σ, x)

Evaluate the differential proximal map of $F$ stored within AbstractPrimalDualManifoldObjective

$$D\operatorname{prox}_{σF}(x)[X]$$

which can also be computed in place of y.

source
Manopt.get_dual_proxFunction
Y = get_dual_prox(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, n, τ, X)
get_dual_prox!(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, Y, n, τ, X)

Evaluate the proximal map of $g_n^*$ stored within AbstractPrimalDualManifoldObjective

$$Y = \operatorname{prox}_{τG_n^*}(X)$$

which can also be computed in place of Y.

source
Manopt.get_primal_proxFunction
q = get_primal_prox(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, σ, p)
get_primal_prox!(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, q, σ, p)

Evaluate the proximal map of $F$ stored within AbstractPrimalDualManifoldObjective

$$q = \operatorname{prox}_{σF}(p)$$

which can also be computed in place of q.

source
Manopt.linearized_forward_operatorFunction
Y = linearized_forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)
linearized_forward_operator!(M::AbstractManifold, N::AbstractManifold, Y, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)

Evaluate the linearized operator (differential) $DΛ(m)[X]$ stored within the AbstractPrimalDualManifoldObjective (in place of Y), where n = Λ(m).

source

## Constrained Objective

Besides the AbstractEvaluationType there is one further property to distinguish among constraint functions, especially the gradients of the constraints.

Manopt.VectorConstraintType
VectorConstraint <: ConstraintType

A type to indicate that constraints are implemented as a vector of functions, e.g. $g_i(p) ∈ \mathbb R$, $i=1,…,m$.

source

The ConstraintType is a parameter of the corresponding Objective.

Manopt.ConstrainedManifoldObjectiveType
ConstrainedManifoldObjective{T<:AbstractEvaluationType, C<:ConstraintType} <: AbstractManifoldObjective{T}

Describes the constrained objective

$$\begin{aligned} \operatorname*{arg\,min}_{p ∈\mathcal{M}} & f(p)\\ \text{subject to } &g_i(p)\leq 0 \quad \text{ for all } i=1,…,m,\\ \quad &h_j(p)=0 \quad \text{ for all } j=1,…,n. \end{aligned}$$

It consists of

• a cost function $f(p)$
• the gradient of $f$, $\operatorname{grad}f(p)$ (see AbstractManifoldGradientObjective)
• inequality constraints $g(p)$, either a function g returning a vector or a vector [g1, g2,...,gm] of functions.
• equality constraints $h(p)$, either a function h returning a vector or a vector [h1, h2,...,hn] of functions.
• gradient(s) of the inequality constraints $\operatorname{grad}g(p) ∈ (T_p\mathcal M)^m$, either a function or a vector of functions.
• gradient(s) of the equality constraints $\operatorname{grad}h(p) ∈ (T_p\mathcal M)^n$, either a function or a vector of functions.

There are two ways to specify the constraints $g$ and $h$.

1. as one function returning a vector in $\mathbb R^m$ and $\mathbb R^n$, respectively. This might be easier to implement but requires evaluating all constraints even if only one is needed.
2. as an AbstractVector{<:Function}, where each function returns a real number. This requires each constraint to be implemented as a single function, but it is possible to evaluate only a single constraint.

The gradients $\operatorname{grad}g$, $\operatorname{grad}h$ have to follow the same form. Additionally they can be implemented as in-place functions or as allocating ones. The gradient $\operatorname{grad}f$ has to be of the same kind. This difference is indicated by the evaluation keyword.

Constructors

ConstrainedManifoldObjective(f, grad_f, g, grad_g, h, grad_h;
evaluation=AllocatingEvaluation()
)

Where f, g, h describe the cost, inequality and equality constraints, respectively, as described above, and grad_f, grad_g, grad_h are the corresponding gradient functions in one of the 4 formats. If the objective does not have inequality constraints, you can set g and grad_g to nothing. If the problem does not have equality constraints, you can set h and grad_h to nothing or leave them out.

ConstrainedManifoldObjective(M::AbstractManifold, F, gradF;
evaluation=AllocatingEvaluation()
)

A keyword argument variant of the constructor above, where you can leave out either g and grad_g or h and grad_h, but not both.

source
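A sketch with one inequality constraint given in the vector-of-functions form (assuming Manifolds.jl is loaded; cost and constraint are illustrative):

```julia
using Manopt, Manifolds

M = Euclidean(2)
f(M, p) = sum(p .^ 2)
grad_f(M, p) = 2 .* p

# inequality constraint g₁(p) ≤ 0 as a vector of functions (VectorConstraint),
# here encoding p₁ ≥ 1
g = [(M, p) -> 1.0 - p[1]]
grad_g = [(M, p) -> [-1.0, 0.0]]

# no equality constraints, so h and grad_h are left out
co = ConstrainedManifoldObjective(f, grad_f, g, grad_g)

p = [2.0, 0.0]
get_grad_inequality_constraint(M, co, p, 1)   # [-1.0, 0.0]
```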

### Access functions

Manopt.get_grad_equality_constraintFunction
get_grad_equality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, j)

evaluate the gradient of the jth equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$.

Note

For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints. It also allocates a full tangent vector.

source
Manopt.get_grad_equality_constraintsFunction
get_grad_equality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p), \ldots, \operatorname{grad} h_n(p)\bigr)$ of the ConstrainedManifoldObjective co at p.

Note

For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints.

source
Manopt.get_grad_equality_constraints!Function
get_grad_equality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)

evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p), \ldots, \operatorname{grad} h_n(p)\bigr)$ of the ConstrainedManifoldObjective co at p in place of X, which is a vector of n tangent vectors.

source
Manopt.get_grad_equality_constraint!Function
get_grad_equality_constraint!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p, j)

Evaluate the gradient of the jth equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$ in place of $X$.

Note

For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints, and it allocates a full vector of tangent vectors.

source
Manopt.get_grad_inequality_constraintFunction
get_grad_inequality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, i)

Evaluate the gradient of the ith inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$.

Note

For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.

source
Manopt.get_grad_inequality_constraint!Function
get_grad_inequality_constraint!(P, X, p, i)

Evaluate the gradient of the ith inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$ of the ConstrainedManifoldObjective P in place of $X$.

Note

For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints. It also allocates a full vector of tangent vectors.

source
Manopt.get_grad_inequality_constraintsFunction
get_grad_inequality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),…,\operatorname{grad} g_m(p)\bigr)$ of the ConstrainedManifoldObjective co at p.

Note

For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.

source
Manopt.get_grad_inequality_constraints!Function
get_grad_inequality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)

evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p), \ldots, \operatorname{grad} g_m(p)\bigr)$ of the ConstrainedManifoldObjective co at p in place of X, which is a vector of $m$ tangent vectors.

source

## Cache Objective

Since single function calls, e.g. to the cost or the gradient, might be expensive, a simple cache objective exists as a decorator, that caches one cost value or gradient.

This feature was only recently introduced in Manopt 0.4 and might still be a little unstable. The cache::Symbol= keyword argument of the solvers might, for example, be extended or still change slightly.

Manopt.SimpleCacheObjectiveType
 SimpleCacheObjective{O<:AbstractManifoldGradientObjective{E,TC,TG}, P, T,C} <: AbstractManifoldGradientObjective{E,TC,TG}

Provide a simple cache for an AbstractManifoldGradientObjective, that is, for a given point p this cache stores the point p, the gradient $\operatorname{grad} f(p)$ in X, as well as the cost value $f(p)$ in c.

Both X and c are accompanied by booleans to keep track of their validity.

Constructor

SimpleCacheObjective(M::AbstractManifold, obj::AbstractManifoldGradientObjective; kwargs...)

Keyword

• p (rand(M)) – a point on the manifold to initialize the cache with
• X (get_gradient(M, obj, p) or zero_vector(M,p)) – a tangent vector to store the gradient in, see also initialized
• c (get_cost(M, obj, p) or 0.0) – a value to store the cost in, see also initialized
• initialized (true) – whether to initialize the cached X and c or not.
source
Manopt.objective_cache_factoryFunction
objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Symbol)

Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache.

The following caches are available

source
objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Tuple{Symbol, Array, Array})

Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache[1], where the second element cache[2] is an array of further arguments for the cache and the third element is passed down as keyword arguments.

For all available caches see the simpler variant with symbols.

source
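A sketch of the symbol variant (assuming Manifolds.jl is loaded; cost and gradient are illustrative):

```julia
using Manopt, Manifolds

M = Euclidean(2)
f(M, p) = sum(p .^ 2)
grad_f(M, p) = 2 .* p
obj = ManifoldGradientObjective(f, grad_f)

# generate the SimpleCacheObjective via the symbol interface
cobj = objective_cache_factory(M, obj, :Simple)

# the first call evaluates and caches, repeated calls at the same point reuse the cache
get_cost(M, cobj, [1.0, 1.0])   # 2.0
```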