A Manifold Objective
The objective describes the actual cost function and all its properties.
Manopt.AbstractManifoldObjective — Type
AbstractManifoldObjective{T<:AbstractEvaluationType}
Describe the collection of the optimization function $f\colon \mathcal M → ℝ$ (or even a vectorial range) and its corresponding elements, which might for example be a gradient or (one or more) proximal maps.
All these elements should usually be implemented as functions (M, p) -> ... or (M, X, p) -> ..., that is
- the first argument of these functions should be the manifold M they are defined on,
- the argument X is present if the computation is performed in place of X (see InplaceEvaluation),
- the argument p is the point the function ($f$ or one of its elements) is evaluated at.
The type T indicates the global AbstractEvaluationType.
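As a small illustration of these conventions, a cost and a gradient in both the allocating and the in-place form might look as follows. This is a minimal sketch that assumes the Sphere from Manifolds.jl; the reference point q is arbitrary illustration data.

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]

# cost: (M, p) -> value
f(M, p) = distance(M, p, q)^2
# allocating gradient: (M, p) -> X
grad_f(M, p) = -2 * log(M, p, q)
# in-place gradient: (M, X, p) -> X
grad_f!(M, X, p) = (X .= -2 .* log(M, p, q); X)
```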
Manopt.decorate_objective! — Function
decorate_objective!(M, o::AbstractManifoldObjective)
decorate the AbstractManifoldObjective o with specific decorators.
Optional Arguments
Optional arguments provide necessary details on the decorators. A specific one is used to activate certain decorators.
- cache – (missing) currently only supports the SimpleCacheObjective, which is activated by either specifying the symbol :Simple or the tuple (:Simple, kwargs...) to pass down keyword arguments
All other keywords are ignored.
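A sketch of activating the cache decorator, reusing the sphere cost and gradient from the example above (which is an assumption of this illustration):

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)^2
grad_f(M, p) = -2 * log(M, p, q)

obj = ManifoldGradientObjective(f, grad_f)
# activate the simple cache decorator via its symbol
cached_obj = decorate_objective!(M, obj; cache=:Simple)
```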
See also
An objective has two main possibilities for its constituent functions concerning the evaluation mode, not necessarily for the cost, but for example for the gradient in an AbstractManifoldGradientObjective.
Manopt.AbstractEvaluationType — Type
AbstractEvaluationType
An abstract type to specify the kind of evaluation an AbstractManifoldObjective supports.
Manopt.AllocatingEvaluation — Type
AllocatingEvaluation <: AbstractEvaluationType
A parameter for an AbstractManoptProblem indicating that the problem uses functions that allocate memory for their result, i.e. they work out of place.
Manopt.InplaceEvaluation — Type
InplaceEvaluation <: AbstractEvaluationType
A parameter for an AbstractManoptProblem indicating that the problem uses functions that do not allocate memory but work on their input, i.e. in place.
Manopt.evaluation_type — Function
evaluation_type(mp::AbstractManoptProblem)
Get the AbstractEvaluationType of the objective in the AbstractManoptProblem mp.
evaluation_type(::AbstractManifoldObjective{Teval})
Get the AbstractEvaluationType of the objective.
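A sketch of querying the evaluation type, using the ManifoldGradientObjective introduced below and the sphere functions from the first example (both assumptions of this illustration):

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)^2
grad_f(M, p) = -2 * log(M, p, q)
grad_f!(M, X, p) = (X .= -2 .* log(M, p, q); X)

obj_alloc = ManifoldGradientObjective(f, grad_f)
obj_inplace = ManifoldGradientObjective(f, grad_f!; evaluation=InplaceEvaluation())

evaluation_type(obj_alloc)    # AllocatingEvaluation
evaluation_type(obj_inplace)  # InplaceEvaluation
```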
Cost Objective
Manopt.AbstractManifoldCostObjective — Type
AbstractManifoldCostObjective{T<:AbstractEvaluationType} <: AbstractManifoldObjective{T}
Representing objectives on manifolds with a cost function implemented.
Manopt.ManifoldCostObjective — Type
ManifoldCostObjective{T, TC} <: AbstractManifoldCostObjective{T, TC}
specify an AbstractManifoldObjective that only has information about the cost function $f\colon \mathcal M → ℝ$, implemented as a function (M, p) -> c to compute the cost value c at p on the manifold M.
Fields
- cost – a function $f\colon \mathcal M → ℝ$ to minimize
Constructors
ManifoldCostObjective(f)
Generate a problem. While this problem does not have any allocating functions, the type T can be set for consistency reasons with other problems.
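A minimal sketch of creating a cost-only objective and evaluating it, assuming the Sphere from Manifolds.jl and an arbitrary reference point q:

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]

obj = ManifoldCostObjective((M, p) -> distance(M, p, q)^2)
get_cost(M, obj, [1.0, 0.0, 0.0])  # ≈ (π/2)^2
```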
Used with
Access functions
Manopt.get_cost — Function
get_cost(amp::AbstractManoptProblem, p)
evaluate the cost function f stored within the AbstractManifoldObjective of an AbstractManoptProblem amp at the point p.
get_cost(M::AbstractManifold, obj::AbstractManifoldObjective, p)
evaluate the cost function f defined on M and stored within the AbstractManifoldObjective at the point p.
get_cost(M::AbstractManifold, mco::AbstractManifoldCostObjective, p)
Evaluate the cost function from within the AbstractManifoldCostObjective on M at p.
By default this implementation assumes that the cost is stored within mco.cost.
Manopt.get_cost_function — Function
get_cost_function(amco::AbstractManifoldCostObjective)
return the function to evaluate (just) the cost $f(p)=c$ as a function (M, p) -> c.
Gradient Objectives
Manopt.AbstractManifoldGradientObjective — Type
AbstractManifoldGradientObjective{E<:AbstractEvaluationType, TC, TG} <: AbstractManifoldCostObjective{E, TC}
An abstract type for all objectives that provide a (full) gradient, where E is an AbstractEvaluationType for the gradient function.
Manopt.ManifoldGradientObjective — Type
ManifoldGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}
specify an objective containing a cost and its gradient
Fields
- cost – a function $f\colon\mathcal M → ℝ$
- gradient!! – the gradient $\operatorname{grad}f\colon\mathcal M → \mathcal T\mathcal M$ of the cost function $f$.
Depending on the AbstractEvaluationType T the gradient can have two forms:
- as a function (M, p) -> X that allocates memory for X, i.e. an AllocatingEvaluation
- as a function (M, X, p) -> X that works in place of X, i.e. an InplaceEvaluation
Constructors
ManifoldGradientObjective(cost, gradient; evaluation=AllocatingEvaluation())
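A sketch of both evaluation variants together with the access functions described further below; the sphere cost and gradients are the illustration data from the earlier examples:

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]
p = [1.0, 0.0, 0.0]
f(M, p) = distance(M, p, q)^2
grad_f(M, p) = -2 * log(M, p, q)
grad_f!(M, X, p) = (X .= -2 .* log(M, p, q); X)

obj = ManifoldGradientObjective(f, grad_f)   # allocating (default)
X = get_gradient(M, obj, p)                  # allocates the result

obj! = ManifoldGradientObjective(f, grad_f!; evaluation=InplaceEvaluation())
Y = zero_vector(M, p)
get_gradient!(M, Y, obj!, p)                 # writes the result into Y
```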
Used with
Manopt.ManifoldAlternatingGradientObjective — Type
ManifoldAlternatingGradientObjective{E<:AbstractEvaluationType,TCost,TGradient} <: AbstractManifoldGradientObjective{E}
An alternating gradient objective consists of
- a cost function $F(x)$
- a gradient $\operatorname{grad}F$ that is either
  - given as one function $\operatorname{grad}F$ returning a tangent vector X on M, or
  - given as an array of gradient functions $\operatorname{grad}F_i$, $i=1,…,n$, each returning a component of the gradient.
This objective is usually defined using the ProductManifold from Manifolds.jl, so Manifolds.jl needs to be loaded.
Constructors
ManifoldAlternatingGradientObjective(F, gradF::Function;
evaluation=AllocatingEvaluation()
)
ManifoldAlternatingGradientObjective(F, gradF::AbstractVector{<:Function};
evaluation=AllocatingEvaluation()
)
Create an alternating gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.
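A sketch of the single-function variant on a product of two spheres. The coupling cost chosen here, and representing product points and tangents as ArrayPartition, are assumptions of this illustration:

```julia
using Manifolds, Manopt
using RecursiveArrayTools: ArrayPartition

N = ProductManifold(Sphere(2), Sphere(2))

# couple the two components: minimize their squared distance
F(N, x) = distance(N[1], x[N, 1], x[N, 2])^2
function gradF(N, x)
    X1 = -2 * log(N[1], x[N, 1], x[N, 2])  # component w.r.t. the first factor
    X2 = -2 * log(N[2], x[N, 2], x[N, 1])  # component w.r.t. the second factor
    return ArrayPartition(X1, X2)
end

obj = ManifoldAlternatingGradientObjective(F, gradF)
```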
Manopt.ManifoldStochasticGradientObjective — Type
ManifoldStochasticGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}
A stochastic gradient objective consists of
- a(n optional) cost function $f(p) = \displaystyle\sum_{i=1}^n f_i(p)$
- an array of gradients, $\operatorname{grad}f_i(p), i=1,\ldots,n$, which can be given in two forms
  - as one single function $(\mathcal M, p) ↦ (X_1,…,X_n) \in (T_p\mathcal M)^n$
  - as a vector of functions $\bigl( (\mathcal M, p) ↦ X_1, …, (\mathcal M, p) ↦ X_n\bigr)$.
Both variants can also be provided as InplaceEvaluation functions, i.e. (M, X, p) -> X, where X is the vector of X1,...,Xn, and (M, X1, p) -> X1, ..., (M, Xn, p) -> Xn, respectively.
Constructors
ManifoldStochasticGradientObjective(
grad_f::Function;
cost=Missing(),
evaluation=AllocatingEvaluation()
)
ManifoldStochasticGradientObjective(
grad_f::AbstractVector{<:Function};
cost=Missing(), evaluation=AllocatingEvaluation()
)
Create a stochastic gradient problem with an optional cost and the gradient either as one function (returning an array of tangent vectors) or a vector of functions (each returning one tangent vector).
Used with
Note that this can also be used with gradient_descent, since the (complete) gradient is just the sum of the single gradients.
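A sketch using the vector-of-functions form for a Riemannian-mean-type cost on the sphere; the data points are arbitrary illustration choices:

```julia
using Manifolds, Manopt

M = Sphere(2)
data = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

f(M, p) = sum(distance(M, p, d)^2 for d in data)
# one gradient function per summand f_i(p) = d(p, d_i)^2
grad_fs = [(M, p) -> -2 * log(M, p, d) for d in data]

obj = ManifoldStochasticGradientObjective(grad_fs; cost=f)
```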
Manopt.NonlinearLeastSquaresObjective — Type
NonlinearLeastSquaresObjective{T<:AbstractEvaluationType} <: AbstractManifoldObjective{T}
A type for nonlinear least squares problems. T is an AbstractEvaluationType for the function F and its Jacobian.
Specify a nonlinear least squares problem
Fields
- F – a function $F\colon \mathcal M → ℝ^d$ to minimize
- jacF!! – Jacobian of the function $F$
- jacB – the basis of tangent space used for computing the Jacobian
- num_components – number of values returned by F (equal to d)
Depending on the AbstractEvaluationType T the function $F$ has to be provided:
- as a function (M::AbstractManifold, p) -> v that allocates memory for v itself for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p) -> v that works in place of v for an InplaceEvaluation.
Also the Jacobian jacF!! is required:
- as a function (M::AbstractManifold, p; basis_domain::AbstractBasis) -> v that allocates memory for v itself for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p; basis_domain::AbstractBasis) -> v that works in place of v for an InplaceEvaluation.
Constructors
NonlinearLeastSquaresProblem(M, F, jacF, num_components; evaluation=AllocatingEvaluation(), jacB=DefaultOrthonormalBasis())
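A sketch of distance residuals and a Jacobian in the coordinates of basis_domain. The data points, the gradient formula for the residuals, and the assumption that the constructor mirrors the type name are all illustration choices:

```julia
using Manifolds, Manopt

M = Sphere(2)
data = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]

# residuals F(p) = (d(p, data_1), ..., d(p, data_d))
F(M, p) = [distance(M, p, d) for d in data]

# Jacobian: row i holds the coordinates of grad F_i(p) in basis_domain
function jacF(M, p; basis_domain::AbstractBasis=DefaultOrthonormalBasis())
    grads = [-log(M, p, d) / max(distance(M, p, d), eps()) for d in data]
    return reduce(vcat, (get_coordinates(M, p, X, basis_domain)' for X in grads))
end

obj = NonlinearLeastSquaresObjective(F, jacF, length(data))
```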
See also
There is also a second variant, if just one function is responsible for computing the cost and the gradient:
Manopt.ManifoldCostGradientObjective — Type
ManifoldCostGradientObjective{T} <: AbstractManifoldObjective{T}
specify an objective containing one function to perform a combined computation of the cost and its gradient
Fields
- costgrad!! – a function that computes both the cost $f\colon\mathcal M → ℝ$ and its gradient $\operatorname{grad}f\colon\mathcal M → \mathcal T\mathcal M$
Depending on the AbstractEvaluationType T the gradient can have two forms:
- as a function (M, p) -> (c, X) that allocates memory for the gradient X, i.e. an AllocatingEvaluation
- as a function (M, X, p) -> (c, X) that works in place of X, i.e. an InplaceEvaluation
Constructors
ManifoldCostGradientObjective(costgrad; evaluation=AllocatingEvaluation())
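A sketch on the sphere where the logarithmic map is computed once and reused for both values; sharing such an intermediate result is the typical reason to use the combined form (the cost and point q are illustration choices):

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]

function costgrad(M, p)
    X = log(M, p, q)          # compute once, reuse for cost and gradient
    return (norm(M, p, X)^2, -2 * X)
end

obj = ManifoldCostGradientObjective(costgrad)
```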
Used with
Access functions
Manopt.get_gradient — Function
get_gradient(s::AbstractManoptSolverState)
return the (last stored) gradient within AbstractManoptSolverState s. By default this also undecorates the state beforehand.
get_gradient(amp::AbstractManoptProblem, p)
get_gradient!(amp::AbstractManoptProblem, X, p)
evaluate the gradient of an AbstractManoptProblem amp at the point p.
The evaluation is done in place of X for the !-variant.
get_gradient(M::AbstractManifold, mgo::AbstractManifoldGradientObjective{T}, p)
get_gradient!(M::AbstractManifold, X, mgo::AbstractManifoldGradientObjective{T}, p)
evaluate the gradient of an AbstractManifoldGradientObjective{T} mgo at p.
The evaluation is done in place of X for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=InplaceEvaluation, memory for the result is allocated.
Note that the order of parameters follows the philosophy of Manifolds.jl, namely that even for the mutating variant, the manifold is the first parameter and the (in-place) tangent vector X comes second.
get_gradient(agst::AbstractGradientSolverState)
return the gradient stored within gradient options. The default returns agst.X.
get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p, k)
get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, Y, p, k)
Evaluate one of the summands gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of Y).
If you use a single function for the stochastic gradient that works in place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) can not be determined.
get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, X, p)
Evaluate the complete gradient $\operatorname{grad} f = \displaystyle\sum_{i=1}^n \operatorname{grad} f_i(p)$ at p (in place of X).
If you use a single function for the stochastic gradient that works in place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) can not be determined.
X = get_gradient(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)
get_gradient!(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, X, p)
Evaluate all summands gradients at a point p on the ProductManifold M (in place of X).
X = get_gradient(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, p, k)
get_gradient!(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, X, p, k)
Evaluate one of the component gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of X).
Manopt.get_gradients — Function
get_gradients(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
get_gradients!(M::AbstractManifold, X, sgo::ManifoldStochasticGradientObjective, p)
Evaluate all summands gradients $\{\operatorname{grad}f_i\}_{i=1}^n$ at p (in place of X).
If you use a single function for the stochastic gradient that works in place, then get_gradients is not available, since the length (or number of elements of the gradient) can not be determined.
Manopt.get_gradient_function — Function
get_gradient_function(amgo::AbstractManifoldGradientObjective{E<:AbstractEvaluationType})
return the function to evaluate (just) the gradient $\operatorname{grad} f(p)$. Depending on the AbstractEvaluationType E this is a function
- (M, p) -> X for the AllocatingEvaluation case,
- (M, X, p) -> X for the InplaceEvaluation, i.e. working in place of X.
Subgradient Objective
Manopt.ManifoldSubgradientObjective — Type
ManifoldSubgradientObjective{T<:AbstractEvaluationType,C,S} <: AbstractManifoldCostObjective{T, C}
A structure to store information about an objective for a subgradient based optimization problem
Fields
- cost – the function $F$ to be minimized
- subgradient – a function returning a subgradient $\partial F$ of $F$
Constructor
ManifoldSubgradientObjective(f, ∂f)
Generate the ManifoldSubgradientObjective for a subgradient objective, i.e. a (cost) function f(M, p) and a function ∂f(M, p) that returns a not necessarily deterministic element from the subdifferential at p on a manifold M.
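A sketch for the Riemannian distance to a point q, which is not differentiable at q itself; returning the zero vector there is valid, since it lies in the subdifferential (the manifold and q are illustration choices):

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]

f(M, p) = distance(M, p, q)
function ∂f(M, p)
    d = distance(M, p, q)
    d == 0 && return zero_vector(M, p)  # 0 lies in the subdifferential at q
    return -log(M, p, q) / d            # the gradient where f is smooth
end

obj = ManifoldSubgradientObjective(f, ∂f)
```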
Access Functions
Manopt.get_subgradient — Function
get_subgradient(amp::AbstractManoptProblem, p)
get_subgradient!(amp::AbstractManoptProblem, X, p)
evaluate the subgradient of an AbstractManoptProblem amp at point p.
The evaluation is done in place of X for the !-variant. The result might not be deterministic; one element of the subdifferential is returned.
X = get_subgradient(M::AbstractManifold, sgo::ManifoldSubgradientObjective, p)
get_subgradient!(M::AbstractManifold, X, sgo::ManifoldSubgradientObjective, p)
Evaluate the (sub)gradient of a ManifoldSubgradientObjective sgo at the point p.
The evaluation is done in place of X for the !-variant. The result might not be deterministic; one element of the subdifferential is returned.
Proximal Map Objective
Manopt.ManifoldProximalMapObjective — Type
ManifoldProximalMapObjective{E<:AbstractEvaluationType, TC, TP, V <: Vector{<:Integer}} <: AbstractManifoldCostObjective{E, TC}
specify a problem for solvers based on the evaluation of proximal map(s).
Fields
- cost – a function $F\colon\mathcal M→ℝ$ to minimize
- proxes – proximal maps $\operatorname{prox}_{λ\varphi}\colon\mathcal M→\mathcal M$ as functions (M, λ, p) -> q
- number_of_proxes – (ones(length(proxes))) number of proximal maps per function, e.g. if one of the maps is a combined one, such that the proximal map functions return more than one entry per function, you have to adapt this value. If not specified, it is set to one prox per function.
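A sketch with two squared-distance terms, whose proximal maps move along the shortest geodesic towards the respective point by the step $λ/(1+λ)$; passing the proximal maps as a tuple, and the cost itself, are assumptions of this illustration:

```julia
using Manifolds, Manopt

M = Sphere(2)
q1 = [1.0, 0.0, 0.0]
q2 = [0.0, 1.0, 0.0]

f(M, p) = distance(M, p, q1)^2 / 2 + distance(M, p, q2)^2 / 2
# prox of p ↦ d(p, q)^2 / 2 moves towards q by t = λ/(1+λ)
proxes = (
    (M, λ, p) -> shortest_geodesic(M, p, q1, λ / (1 + λ)),
    (M, λ, p) -> shortest_geodesic(M, p, q2, λ / (1 + λ)),
)

obj = ManifoldProximalMapObjective(f, proxes)
get_proximal_map(M, obj, 0.5, [0.0, 0.0, 1.0], 1)  # apply the first proximal map
```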
See also
Access Functions
Manopt.get_proximal_map — Function
q = get_proximal_map(M::AbstractManifold, mpo::ManifoldProximalMapObjective, λ, p, i)
get_proximal_map!(M::AbstractManifold, q, mpo::ManifoldProximalMapObjective, λ, p, i)
evaluate the i-th proximal map of the ManifoldProximalMapObjective mpo at the point p on M with parameter $λ>0$.
Hessian Objective
Manopt.ManifoldHessianObjective — Type
ManifoldHessianObjective{T<:AbstractEvaluationType,C,G,H,Pre} <: AbstractManifoldGradientObjective{T}
specify a problem for Hessian based algorithms.
Fields
- cost – a function $F\colon\mathcal M→ℝ$ to minimize
- gradient – the gradient $\operatorname{grad}F\colon\mathcal M → \mathcal T\mathcal M$ of the cost function $F$
- hessian – the Hessian $\operatorname{Hess}F(x)[⋅]\colon \mathcal T_{x} \mathcal M → \mathcal T_{x} \mathcal M$ of the cost function $F$
- preconditioner – the symmetric, positive definite preconditioner as an approximation of the inverse of the Hessian of $F$, i.e. as a map with the same input variables as the hessian.
Depending on the AbstractEvaluationType T the gradient and Hessian can have two forms:
- as functions (M, p) -> X and (M, p, X) -> Y, resp., i.e. an AllocatingEvaluation
- as functions (M, X, p) -> X and (M, Y, p, X) -> Y, resp., i.e. an InplaceEvaluation
Constructor
ManifoldHessianObjective(f, grad_f, Hess_f, preconditioner = (M, p, X) -> X;
evaluation=AllocatingEvaluation())
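A sketch for the Rayleigh-quotient-type cost $F(p) = p^{\mathrm{T}}Ap$ on the sphere, whose Riemannian gradient and Hessian have closed forms; the matrix A is arbitrary illustration data, and the default preconditioner is kept:

```julia
using LinearAlgebra, Manifolds, Manopt

M = Sphere(2)
A = Diagonal([2.0, 1.0, 0.5])

F(M, p) = p' * A * p
# Riemannian gradient: project 2Ap onto the tangent space at p
grad_F(M, p) = 2 * (A * p - (p' * A * p) * p)
# Riemannian Hessian on the sphere applied to a tangent vector X
Hess_F(M, p, X) = 2 * (A * X - (p' * A * X) * p - (p' * A * p) * X)

obj = ManifoldHessianObjective(F, grad_F, Hess_F)
```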
See also
Access functions
Manopt.get_hessian — Function
Y = get_hessian(amp::AbstractManoptProblem{T}, p, X)
get_hessian!(amp::AbstractManoptProblem{T}, Y, p, X)
evaluate the Hessian of an AbstractManoptProblem amp at p applied to a tangent vector X, i.e. compute $\operatorname{Hess}f(p)[X]$, which can also happen in place of Y.
Manopt.get_preconditioner — Function
get_preconditioner(amp::AbstractManoptProblem, p, X)
evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function f) of an AbstractManoptProblem amp's objective at the point p applied to a tangent vector X.
get_preconditioner(M::AbstractManifold, mho::ManifoldHessianObjective, p, X)
evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function F) of a ManifoldHessianObjective mho at the point p applied to a tangent vector X.
Primal-Dual based Objectives
Manopt.AbstractPrimalDualManifoldObjective — Type
AbstractPrimalDualManifoldObjective{E<:AbstractEvaluationType,C,P} <: AbstractManifoldCostObjective{E,C}
A common abstract super type for objectives that consider primal-dual problems.
Manopt.PrimalDualManifoldObjective — Type
PrimalDualManifoldObjective{E<:AbstractEvaluationType} <: AbstractPrimalDualManifoldObjective{E}
Describes an objective for the linearized or exact Chambolle-Pock algorithm. [BergmannHerzogSilvaLouzeiroTenbrinckVidalNunez2020][ChambollePock2011]
Fields
All fields with !! can either be mutating or nonmutating functions, which should be set depending on the parameter T <: AbstractEvaluationType.
- cost – $F + G(Λ(⋅))$ to evaluate interim cost function values
- linearized_forward_operator!! – linearized operator for the forward operation in the algorithm $DΛ$
- linearized_adjoint_operator!! – the adjoint differential $(DΛ)^* \colon \mathcal N → T\mathcal M$
- prox_f!! – the proximal map belonging to $f$
- prox_G_dual!! – the proximal map belonging to $g_n^*$
- Λ!! – (forward_operator) the forward operator (if given) $Λ\colon \mathcal M → \mathcal N$
Usually, either the linearized operator $DΛ$ or $Λ$ is required.
Constructor
PrimalDualManifoldObjective(cost, prox_f, prox_G_dual, adjoint_linearized_operator;
linearized_forward_operator::Union{Function,Missing}=missing,
Λ::Union{Function,Missing}=missing,
evaluation::AbstractEvaluationType=AllocatingEvaluation()
)
The last optional argument can be used to provide the 4 or 5 functions as allocating or mutating (in place computation) ones. Note that the first argument is always the manifold under consideration, the mutated one is the second.
Manopt.PrimalDualManifoldSemismoothNewtonObjective — Type
PrimalDualManifoldSemismoothNewtonObjective{E<:AbstractEvaluationType, TC, LO, ALO, PF, DPF, PG, DPG, L} <: AbstractPrimalDualManifoldObjective{E, TC, PF}
Describes a problem for the primal-dual Riemannian semismooth Newton algorithm. [DiepeveenLellmann2021]
Fields
- cost – $F + G(Λ(⋅))$ to evaluate interim cost function values
- linearized_operator – the linearization $DΛ(⋅)[⋅]$ of the operator $Λ(⋅)$
- linearized_adjoint_operator – the adjoint differential $(DΛ)^* \colon \mathcal N \to T\mathcal M$
- prox_F – the proximal map belonging to $F$
- diff_prox_F – the (Clarke generalized) differential of the proximal maps of $F$
- prox_G_dual – the proximal map belonging to $G_n^*$
- diff_prox_dual_G – the (Clarke generalized) differential of the proximal maps of $G^\ast_n$
- Λ – the exact forward operator. This operator is required if Λ(m)=n does not hold.
Constructor
PrimalDualManifoldSemismoothNewtonObjective(cost, prox_F, prox_G_dual, forward_operator, adjoint_linearized_operator,Λ)
Access functions
Manopt.adjoint_linearized_operator — Function
X = adjoint_linearized_operator(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)
adjoint_linearized_operator(N::AbstractManifold, X, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)
Evaluate the adjoint of the linearized forward operator $(DΛ(m))^*[Y]$ stored within the AbstractPrimalDualManifoldObjective (in place of X). Since $Y∈T_n\mathcal N$, both $m$ and $n=Λ(m)$ are necessary arguments, mainly because the forward operator $Λ$ might be missing.
Manopt.forward_operator — Function
q = forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, p)
forward_operator!(M::AbstractManifold, N::AbstractManifold, q, apdmo::AbstractPrimalDualManifoldObjective, p)
Evaluate the forward operator $Λ(p)$ stored within the TwoManifoldProblem (in place of q).
Manopt.get_differential_dual_prox — Function
η = get_differential_dual_prox(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, n, τ, X, ξ)
get_differential_dual_prox!(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, η, n, τ, X, ξ)
Evaluate the differential proximal map of $G_n^*$ stored within PrimalDualManifoldSemismoothNewtonObjective
\[D\operatorname{prox}_{τG_n^*}(X)[ξ]\]
which can also be computed in place of η.
Manopt.get_differential_primal_prox — Function
y = get_differential_primal_prox(M::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, σ, x)
get_differential_primal_prox!(M::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, y, σ, x)
Evaluate the differential proximal map of $F$ stored within AbstractPrimalDualManifoldObjective
\[D\operatorname{prox}_{σF}(x)[X]\]
which can also be computed in place of y.
Manopt.get_dual_prox — Function
Y = get_dual_prox(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, n, τ, X)
get_dual_prox!(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, Y, n, τ, X)
Evaluate the proximal map of $g_n^*$ stored within AbstractPrimalDualManifoldObjective
\[Y = \operatorname{prox}_{τG_n^*}(X)\]
which can also be computed in place of Y.
Manopt.get_primal_prox — Function
q = get_primal_prox(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, σ, p)
get_primal_prox!(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, q, σ, p)
Evaluate the proximal map of $F$ stored within AbstractPrimalDualManifoldObjective
\[\operatorname{prox}_{σF}(p)\]
which can also be computed in place of q.
Manopt.linearized_forward_operator — Function
Y = linearized_forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)
linearized_forward_operator!(M::AbstractManifold, N::AbstractManifold, Y, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)
Evaluate the linearized operator (differential) $DΛ(m)[X]$ stored within the AbstractPrimalDualManifoldObjective (in place of Y), where n = Λ(m).
Constrained Objective
Besides the AbstractEvaluationType there is one further property to distinguish among constraint functions, especially the gradients of the constraints.
Manopt.ConstraintType — Type
ConstraintType
An abstract type to represent the different forms of representing constraints
Manopt.FunctionConstraint — Type
FunctionConstraint <: ConstraintType
A type to indicate that constraints are implemented as one whole function, e.g. $g(p) ∈ \mathbb R^m$.
Manopt.VectorConstraint — Type
VectorConstraint <: ConstraintType
A type to indicate that constraints are implemented as a vector of functions, e.g. $g_i(p) ∈ \mathbb R, i=1,…,m$.
The ConstraintType is a parameter of the corresponding objective.
Manopt.ConstrainedManifoldObjective — Type
ConstrainedManifoldObjective{T<:AbstractEvaluationType, C<:ConstraintType} <: AbstractManifoldObjective{T}
Describes the constrained objective
\[\begin{aligned} \operatorname*{arg\,min}_{p ∈\mathcal{M}} & f(p)\\ \text{subject to } &g_i(p)\leq0 \quad \text{ for all } i=1,…,m,\\ \quad &h_j(p)=0 \quad \text{ for all } j=1,…,n. \end{aligned}\]
It consists of
- a cost function $f(p)$
- the gradient of $f$, $\operatorname{grad}f(p)$, as in an AbstractManifoldGradientObjective
- inequality constraints $g(p)$, either a function g returning a vector or a vector [g1, g2, ..., gm] of functions
- equality constraints $h(p)$, either a function h returning a vector or a vector [h1, h2, ..., hn] of functions
- gradient(s) of the inequality constraints $\operatorname{grad}g(p) ∈ (T_p\mathcal M)^m$, either a function or a vector of functions
- gradient(s) of the equality constraints $\operatorname{grad}h(p) ∈ (T_p\mathcal M)^n$, either a function or a vector of functions
There are two ways to specify the constraints $g$ and $h$:
- as one Function returning a vector in $\mathbb R^m$ and $\mathbb R^n$ respectively. This might be easier to implement but requires evaluating all constraints even if only one is needed.
- as an AbstractVector{<:Function} where each function returns a real number. This requires each constraint to be implemented as a single function, but it is possible to evaluate also only a single constraint.
The gradients $\operatorname{grad}g$, $\operatorname{grad}h$ have to follow the same form. Additionally they can be implemented as in-place functions or as allocating ones. The gradient $\operatorname{grad}f$ has to be of the same kind. This difference is indicated by the evaluation keyword.
Constructors
ConstrainedManifoldObjective(f, grad_f, g, grad_g, h, grad_h;
    evaluation=AllocatingEvaluation()
)
Where f, g, h describe the cost, inequality and equality constraints, respectively, as described above, and grad_f, grad_g, grad_h are the corresponding gradient functions in one of the four formats. If the objective does not have inequality constraints, you can set g and grad_g to nothing. If the problem does not have equality constraints, you can set h and grad_h to nothing or leave them out.
ConstrainedManifoldObjective(M::AbstractManifold, F, gradF;
    G=nothing, gradG=nothing, H=nothing, gradH=nothing,
    evaluation=AllocatingEvaluation()
)
A keyword argument variant of the constructor above, where you can leave out either G and gradG or H and gradH, but not both.
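A sketch using the vector forms for a single inequality constraint on the sphere; the cost, the constraint, and the use of project to obtain Riemannian gradients of linear functions are all illustration choices:

```julia
using Manifolds, Manopt

M = Sphere(2)

f(M, p) = p[3]                                   # minimize the height
grad_f(M, p) = project(M, p, [0.0, 0.0, 1.0])    # Riemannian gradient of p ↦ p₃

g = [(M, p) -> -p[1]]                            # g₁(p) = -p₁ ≤ 0, i.e. p₁ ≥ 0
grad_g = [(M, p) -> project(M, p, [-1.0, 0.0, 0.0])]

# no equality constraints, so h and grad_h are left out
obj = ConstrainedManifoldObjective(f, grad_f, g, grad_g)
```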
Access functions
Manopt.get_constraints — Function
get_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)
Return the vector $(g_1(p),...,g_m(p),h_1(p),...,h_n(p))$ from the ConstrainedManifoldObjective co containing the values of all constraints at p.
Manopt.get_equality_constraint — Function
get_equality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, j)
evaluate the j-th equality constraint $(h(p))_j$ or $h_j(p)$.
For the FunctionConstraint representation this still evaluates all constraints.
Manopt.get_equality_constraints — Function
get_equality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)
evaluate all equality constraints $h(p)$ or $\bigl(h_1(p), h_2(p),\ldots,h_n(p)\bigr)$ of the ConstrainedManifoldObjective co at $p$.
Manopt.get_inequality_constraint — Function
get_inequality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, i)
evaluate the i-th inequality constraint $(g(p))_i$ or $g_i(p)$.
For the FunctionConstraint representation this still evaluates all constraints.
Manopt.get_inequality_constraints — Function
get_inequality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)
Evaluate all inequality constraints $g(p)$ or $\bigl(g_1(p), g_2(p),\ldots,g_m(p)\bigr)$ of the ConstrainedManifoldObjective co at $p$.
Manopt.get_grad_equality_constraint — Function
get_grad_equality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, j)
evaluate the gradient of the j-th equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints. It also allocates a full tangent vector.
Manopt.get_grad_equality_constraints — Function
get_grad_equality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)
evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots, \operatorname{grad}h_n(p)\bigr)$ of the ConstrainedManifoldObjective co at p.
For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_equality_constraints! — Function
get_grad_equality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)
evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots,\operatorname{grad} h_n(p)\bigr)$ of the ConstrainedManifoldObjective co at $p$ in place of X, which is a vector of n tangent vectors.
Manopt.get_grad_equality_constraint! — Function
get_grad_equality_constraint!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p, j)
Evaluate the gradient of the j-th equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$ in place of $X$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation of the FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints, and allocates a full vector of tangent vectors.
Manopt.get_grad_inequality_constraint — Function
get_grad_inequality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, i)
Evaluate the gradient of the i-th inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraint! — Function
get_grad_inequality_constraint!(P, X, p, i)
Evaluate the gradient of the i-th inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$ of the ConstrainedManifoldObjective P in place of $X$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints, and allocates a full vector of tangent vectors.
Manopt.get_grad_inequality_constraints — Function
get_grad_inequality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)
evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),…,\operatorname{grad} g_m(p)\bigr)$ of the ConstrainedManifoldObjective co at $p$.
For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraints! — Function
get_grad_inequality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)
evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),\ldots,\operatorname{grad} g_m(p)\bigr)$ of the ConstrainedManifoldObjective co at p in place of X, which is a vector of $m$ tangent vectors.
Cache Objective
Since single function calls, e.g. to the cost or the gradient, might be expensive, a simple cache objective exists as a decorator, that caches one cost value or gradient.
This feature was just recently introduced in Manopt 0.4 and might still be a little unstable. The cache::Symbol= keyword argument of the solvers might, for example, be extended or still change slightly.
Manopt.SimpleCacheObjective — Type
SimpleCacheObjective{O<:AbstractManifoldGradientObjective{E,TC,TG}, P, T, C} <: AbstractManifoldGradientObjective{E,TC,TG}
Provide a simple cache for an AbstractManifoldGradientObjective, i.e. for a given point p this cache stores the point p and a gradient $\operatorname{grad} f(p)$ in X as well as a cost value $f(p)$ in c.
Both X and c are accompanied by booleans to keep track of their validity.
Constructor
SimpleCacheObjective(M::AbstractManifold, obj::AbstractManifoldGradientObjective; kwargs...)
Keyword
- p (rand(M)) – a point on the manifold to initialize the cache with
- X (get_gradient(M, obj, p) or zero_vector(M, p)) – a tangent vector to store the gradient in, see also initialized
- c (get_cost(M, obj, p) or 0.0) – a value to store the cost function in, see also initialized
- initialized (true) – whether to initialize the cached X and c or not.
Manopt.objective_cache_factory — Function
objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Symbol)
Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache.
The following caches are available
- :Simple generates a SimpleCacheObjective
objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Tuple{Symbol, Array})
Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache[1], where the second element cache[2] is an array of further arguments for the cache and the third is passed down as keyword arguments.
For all available caches see the simpler variant with symbols.
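A sketch of creating a cached objective directly via the factory; the sphere objective is the illustration data used throughout this page:

```julia
using Manifolds, Manopt

M = Sphere(2)
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)^2
grad_f(M, p) = -2 * log(M, p, q)

obj = ManifoldGradientObjective(f, grad_f)
cached_obj = objective_cache_factory(M, obj, :Simple)

p = [1.0, 0.0, 0.0]
get_cost(M, cached_obj, p)  # evaluates and caches f(p)
get_cost(M, cached_obj, p)  # a second call at the same p can use the cache
```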
- BergmannHerzogSilvaLouzeiroTenbrinckVidalNunez2020
R. Bergmann, R. Herzog, M. Silva Louzeiro, D. Tenbrinck, J. Vidal-Núñez: Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds, Foundations of Computational Mathematics, 2021. doi: 10.1007/s10208-020-09486-5 arXiv: 1908.02022
- ChambollePock2011
A. Chambolle, T. Pock: A first-order primal-dual algorithm for convex problems with applications to imaging, Journal of Mathematical Imaging and Vision 40(1), 120–145, 2011. doi: 10.1007/s10851-010-0251-1
- DiepeveenLellmann2021
W. Diepeveen, J. Lellmann: An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising, SIAM Journal on Imaging Sciences, 2021. doi: 10.1137/21M1398513