Problems
A problem usually contains its cost function and provides an implementation to access the cost.
Manopt.Problem — Type

Problem{T<:AbstractEvaluationType}

Describe the problem that should be optimized by stating all properties that do not change during an optimization or that depend on a certain solver.
The parameter T can be used to distinguish problems with different representations or implementations. The default parameter is AllocatingEvaluation, which might be slower but is easier to use. The usually faster parameter value is MutatingEvaluation.
See Options for the changing and solver-dependent properties.
Manopt.get_cost — Function

get_cost(P::Problem, p)

Evaluate the cost function F stored within a Problem P at the point p.
A problem can be of different type, more specifically with respect to whether the functions it contains, for example the one to compute the gradient, work with allocation or without it. To be precise, an allocating function X = gradF(x) allocates memory for its result X, while gradF!(X, x) does not.
Manopt.AbstractEvaluationType — Type

AbstractEvaluationType

An abstract type to specify the kind of evaluation a Problem supports.
Manopt.AllocatingEvaluation — Type

AllocatingEvaluation <: AbstractEvaluationType

A parameter for a Problem indicating that the problem uses functions that allocate memory for their result, i.e. they work out of place.
Manopt.MutatingEvaluation — Type

MutatingEvaluation <: AbstractEvaluationType

A parameter for a Problem indicating that the problem uses functions that do not allocate memory but work on their input, i.e. in place.
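For illustration, a minimal sketch of the two conventions for a hypothetical quadratic cost on a Euclidean manifold; the function names are hypothetical, and the manifold is passed as the first argument as elsewhere in Manopt:

```julia
using Manifolds # provides Euclidean

M = Euclidean(3)

# allocating variant: returns a freshly allocated tangent vector
grad_f(M, p) = 2 .* p

# mutating variant: overwrites X and returns it, avoiding the allocation
function grad_f!(M, X, p)
    X .= 2 .* p
    return X
end
```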
Cost based problem
Manopt.CostProblem — Type

CostProblem{T, Manifold, TCost} <: Problem{T}

Specify a problem for solvers based just on cost functions, i.e. gradient-free ones.
Fields
- M – a manifold $\mathcal M$
- cost – a function $F: \mathcal M → ℝ$ to minimize
Constructors
CostProblem(M, cost; evaluation=AllocatingEvaluation())

Generate a problem. While this Problem does not have any allocating functions, the type T can be set for consistency reasons with other problems.
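A minimal sketch of setting up and evaluating such a gradient-free problem; the cost and data point are hypothetical:

```julia
using Manopt, Manifolds

M = Euclidean(3)
q = [1.0, 2.0, 3.0]              # hypothetical data point
f(M, p) = sum((p .- q) .^ 2)     # cost to minimize
P = CostProblem(M, f)
get_cost(P, zeros(3))            # f at the origin, here 14.0
```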
Gradient based problem
Manopt.AbstractGradientProblem — Type

AbstractGradientProblem{T<:AbstractEvaluationType} <: Problem{T}

An abstract type for all problems that provide a (full) gradient, where T is an AbstractEvaluationType for the gradient function.
Manopt.GradientProblem — Type

GradientProblem{T<:AbstractEvaluationType} <: AbstractGradientProblem{T}

Specify a problem for gradient based algorithms.
Fields
- M – a manifold $\mathcal M$
- cost – a function $F: \mathcal M → ℝ$ to minimize
- gradient!! – the gradient $\operatorname{grad}F:\mathcal M → \mathcal T\mathcal M$ of the cost function $F$.
Depending on the AbstractEvaluationType T the gradient has to be provided:
- as a function x -> X that allocates memory for X itself, for an AllocatingEvaluation
- as a function (X, x) -> X that works in place of X, for a MutatingEvaluation
Constructors
GradientProblem(M, cost, gradient; evaluation=AllocatingEvaluation())
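A sketch of both constructor variants for a hypothetical quadratic cost, one problem per evaluation type:

```julia
using Manopt, Manifolds

M = Euclidean(3)
f(M, p) = sum(p .^ 2)
grad_f(M, p) = 2 .* p                  # allocating gradient
grad_f!(M, X, p) = (X .= 2 .* p; X)    # mutating gradient

P1 = GradientProblem(M, f, grad_f)     # evaluation defaults to AllocatingEvaluation()
P2 = GradientProblem(M, f, grad_f!; evaluation=MutatingEvaluation())
```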
Manopt.StochasticGradientProblem — Type

StochasticGradientProblem <: Problem

A stochastic gradient problem consists of

- a Manifold M
- a(n optional) cost function $f(x) = \displaystyle\sum_{i=1}^n f_i(x)$
- an array of gradients, i.e. a function that returns an array, or an array of functions $\{\operatorname{grad}f_i\}_{i=1}^n$, where both variants can be given in the allocating variant and the array of functions may also be provided as mutating functions (X, x) -> X.
Constructors
StochasticGradientProblem(M::AbstractManifold, gradF::Function;
cost=Missing(), evaluation=AllocatingEvaluation()
)
StochasticGradientProblem(M::AbstractManifold, gradF::AbstractVector{<:Function};
cost=Missing(), evaluation=AllocatingEvaluation()
)

Create a stochastic gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.
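As a sketch, a Riemannian mean-type problem on the sphere, where each summand $f_i(p) = \frac{1}{2}d^2(p, q_i)$ contributes one gradient; the data and all function names are hypothetical:

```julia
using Manopt, Manifolds, LinearAlgebra

M = Sphere(2)
data = [normalize(randn(3)) for _ in 1:4]           # hypothetical points on the sphere
f(M, p) = 1 / 2 * sum(distance(M, p, q)^2 for q in data)
grad_fi = [(M, p) -> -log(M, p, q) for q in data]   # gradient of each summand ½d²(p, qᵢ)

P = StochasticGradientProblem(M, grad_fi; cost=f)
X = get_gradient(P, 2, data[1])    # gradient of the second summand at data[1]
Xs = get_gradients(P, data[1])     # all summands' gradients at once
```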
Manopt.get_gradient — Function

get_gradient(O::Options)

Return the (last stored) gradient within Options O. By default this also undecorates the options beforehand.
get_gradient(p::AbstractGradientProblem{T}, x)
get_gradient!(p::AbstractGradientProblem{T}, X, x)

Evaluate the gradient of an AbstractGradientProblem{T} p at the point x.
The evaluation is done in place of X for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=MutatingEvaluation, memory for the result is allocated.
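For illustration, both call variants on a hypothetical allocating problem (assuming a recent ManifoldsBase, where zero_vector allocates a tangent vector):

```julia
using Manopt, Manifolds

M = Euclidean(3)
P = GradientProblem(M, (M, p) -> sum(p .^ 2), (M, p) -> 2 .* p)

p = [1.0, 2.0, 3.0]
X = get_gradient(P, p)    # allocates the result, here [2.0, 4.0, 6.0]
Y = zero_vector(M, p)
get_gradient!(P, Y, p)    # writes the same gradient into Y
```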
get_gradient(p::StochasticGradientProblem, k, x)
get_gradient!(p::StochasticGradientProblem, Y, k, x)

Evaluate one of the summands' gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at x (in place of Y).
Note that for a MutatingEvaluation-based problem with a single function for the stochastic gradient, the mutating variant is not available, since it would require too many allocations.
get_gradient(P::AlternatingGradientProblem, x)
get_gradient!(P::AlternatingGradientProblem, Y, x)

Evaluate all summands' gradients at a point x on the ProductManifold M (in place of Y).
get_gradient(p::AlternatingGradientProblem, k, x)
get_gradient!(p::AlternatingGradientProblem, Y, k, x)

Evaluate one of the component gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at x (in place of Y).
Manopt.get_gradients — Function

get_gradients(P::StochasticGradientProblem, x)
get_gradients!(P::StochasticGradientProblem, Y, x)

Evaluate all summands' gradients $\{\operatorname{grad}f_i\}_{i=1}^n$ at x (in place of Y).
Note that for a MutatingEvaluation-based problem with a single function for the stochastic gradient, the allocating variant is not available.
Subgradient based problem
Manopt.SubGradientProblem — Type

SubGradientProblem <: Problem

A structure to store information about a subgradient based optimization problem.
Fields
- manifold – a Manifold
- cost – the function $F$ to be minimized
- subgradient – a function returning a subgradient $\partial F$ of $F$
Constructor
SubGradientProblem(M, f, ∂f)

Generate the Problem for a subgradient problem, i.e. a function f on the manifold M and a function ∂f that returns an element from the subdifferential at a point.
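A sketch with a nonsmooth, ℓ¹-type cost, where ∂f returns one element of the subdifferential; the cost and data are hypothetical:

```julia
using Manopt, Manifolds

M = Euclidean(3)
q = [1.0, -2.0, 0.5]
f(M, p) = sum(abs, p .- q)
# one element of ∂f; at a kink (pᵢ = qᵢ) the entry 0 is a valid choice
∂f(M, p) = sign.(p .- q)

P = SubGradientProblem(M, f, ∂f)
get_subgradient(P, zeros(3))   # [-1.0, 1.0, -1.0]
```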
Manopt.get_subgradient — Functionget_subgradient(p, q)
get_subgradient!(p, X, q)

Evaluate the (sub)gradient of a SubGradientProblem p at the point q.
The evaluation is done in place of X for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=MutatingEvaluation, memory for the result is allocated.
Constrained problem
Manopt.ConstrainedProblem — Type

ConstrainedProblem{T, Manifold} <: AbstractGradientProblem{T}

Describes the constrained problem
\[\begin{aligned} \operatorname*{arg\,min}_{p ∈\mathcal{M}} & f(p)\\ \text{subject to } &g_i(p)\leq0 \quad \text{ for all } i=1,…,m,\\ \quad &h_j(p)=0 \quad \text{ for all } j=1,…,n. \end{aligned}\]
It consists of
- an AbstractManifold M
- a cost function $f(p)$
- the gradient of $f$, $\operatorname{grad}f(p)$ (cf. AbstractGradientProblem)
- inequality constraints $g(p)$, either a function g returning a vector or a vector [g1, g2, ..., gm] of functions
- equality constraints $h(p)$, either a function h returning a vector or a vector [h1, h2, ..., hn] of functions
- gradient(s) of the inequality constraints $\operatorname{grad}g(p) ∈ (T_p\mathcal M)^m$, either a function or a vector of functions
- gradient(s) of the equality constraints $\operatorname{grad}h(p) ∈ (T_p\mathcal M)^n$, either a function or a vector of functions
There are two ways to specify the constraints $g$ and $h$.
- as one Function returning a vector in $\mathbb R^m$ and $\mathbb R^n$ respectively. This might be easier to implement but requires evaluating all constraints even if only one is needed.
- as an AbstractVector{<:Function} where each function returns a real number. This requires each constraint to be implemented as a single function, but it is then possible to evaluate only a single constraint.
The gradients $\operatorname{grad}g$, $\operatorname{grad}h$ have to follow the same form. Additionally they can be implemented as in-place functions or as allocating ones. The gradient $\operatorname{grad}F$ has to be of the same kind. This difference is indicated by the evaluation keyword.
Constructors
ConstrainedProblem(M::AbstractManifold, F, gradF, G, gradG, H, gradH;
evaluation=AllocatingEvaluation()
)

where F, G, H describe the cost, inequality, and equality constraints as described above, and gradF, gradG, gradH are the corresponding gradient functions in one of the four formats. If the problem does not have inequality constraints, you can set G and gradG to nothing. If the problem does not have equality constraints, you can set H and gradH to nothing or leave them out.
ConstrainedProblem(M::AbstractManifold, F, gradF;
    G=nothing, gradG=nothing, H=nothing, gradH=nothing,
    evaluation=AllocatingEvaluation()
)

A keyword argument variant of the constructor above, where you can leave out either G and gradG or H and gradH, but not both.
Manopt.FunctionConstraint — Type

FunctionConstraint <: ConstraintType

A type to indicate that constraints are implemented as one whole function, e.g. $g(p) ∈ \mathbb R^m$.
Manopt.VectorConstraint — Type

VectorConstraint <: ConstraintType

A type to indicate that constraints are implemented as a vector of functions, e.g. $g_i(p) ∈ \mathbb R$, $i=1,…,m$.
Manopt.ConstraintType — Type

ConstraintType

An abstract type to represent the different forms of representing constraints.
Manopt.get_constraints — Function

get_constraints(P::ConstrainedProblem, p)

Return the vector $(g_1(p),…,g_m(p),h_1(p),…,h_n(p))$ from the ConstrainedProblem P containing the values of all constraints at p.
Manopt.get_equality_constraint — Function

get_equality_constraint(P::ConstrainedProblem, p, j)

Evaluate the jth equality constraint $(h(p))_j$ or $h_j(p)$.
For the FunctionConstraint representation this still evaluates all constraints.
Manopt.get_equality_constraints — Function

get_equality_constraints(P::ConstrainedProblem, p)

Evaluate all equality constraints $h(p)$ or $\bigl(h_1(p), h_2(p),\ldots,h_n(p)\bigr)$ of the ConstrainedProblem $P$ at $p$.
Manopt.get_grad_equality_constraint — Function

get_grad_equality_constraint(P, p, j)

Evaluate the gradient of the jth equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints. It also allocates a full tangent vector.
Manopt.get_grad_equality_constraint! — Function

get_grad_equality_constraint!(P, X, p, j)

Evaluate the gradient of the jth equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$ in place of $X$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints, and it allocates a full vector of tangent vectors.
Manopt.get_grad_equality_constraints — Function

get_grad_equality_constraints(P, p)

Evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots, \operatorname{grad}h_n(p)\bigr)$ of the ConstrainedProblem P at p.
For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_equality_constraints! — Function

get_grad_equality_constraints!(P, X, p)

Evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$ or $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots,\operatorname{grad} h_n(p)\bigr)$ of the ConstrainedProblem $P$ at $p$ in place of X, which is a vector of n tangent vectors.
Manopt.get_inequality_constraint — Function

get_inequality_constraint(P::ConstrainedProblem, p, i)

Evaluate one inequality constraint $(g(p))_i$ or $g_i(p)$.
For the FunctionConstraint representation this still evaluates all constraints.
Manopt.get_inequality_constraints — Function

get_inequality_constraints(P::ConstrainedProblem, p)

Evaluate all inequality constraints $g(p)$ or $\bigl(g_1(p), g_2(p),\ldots,g_m(p)\bigr)$ of the ConstrainedProblem $P$ at $p$.
Manopt.get_grad_inequality_constraint — Function

get_grad_inequality_constraint(P, p, i)

Evaluate the gradient of the ith inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraint! — Function

get_grad_inequality_constraint!(P, X, p, i)

Evaluate the gradient of the ith inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$ of the ConstrainedProblem P in place of $X$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraints — Function

get_grad_inequality_constraints(P, p)

Evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),…,\operatorname{grad} g_m(p)\bigr)$ of the ConstrainedProblem $P$ at $p$.
For the MutatingEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraints! — Function

get_grad_inequality_constraints!(P, X, p)

Evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$ or $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),\ldots,\operatorname{grad} g_m(p)\bigr)$ of the ConstrainedProblem P at p in place of X, which is a vector of $m$ tangent vectors.
Proximal Map(s) based problem
Manopt.ProximalProblem — Type

ProximalProblem <: Problem

Specify a problem for solvers based on the evaluation of proximal map(s).
Fields
- M – a Riemannian manifold
- cost – a function $F:\mathcal M→ℝ$ to minimize
- proxes – proximal maps $\operatorname{prox}_{λ\varphi}:\mathcal M→\mathcal M$ as functions (λ, x) -> y, i.e. the prox parameter λ also belongs to the signature of the proximal map.
- number_of_proxes – (length(proxes)) number of proximal maps, e.g. if one of the maps is a combined one such that the proximal map functions return more than one entry per function
Manopt.get_proximal_map — Function

get_proximal_map(p, λ, x, i)

Evaluate the ith proximal map of ProximalProblem p at the point x of p.M with parameter $λ>0$.
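A sketch with a single proximal map in closed form; it assumes a constructor ProximalProblem(M, cost, proxes) matching the fields above, and the prox signature follows the Fields description (depending on the Manopt version, the manifold may additionally be passed as the first argument):

```julia
using Manopt, Manifolds

M = Euclidean(3)
q = [1.0, 2.0, 3.0]
f(M, p) = 1 / 2 * sum((p .- q) .^ 2)
# prox_{λf} has the closed form (x + λq) / (1 + λ)
prox = (λ, x) -> (x .+ λ .* q) ./ (1 + λ)

P = ProximalProblem(M, f, [prox])
get_proximal_map(P, 0.5, zeros(3), 1)   # ≈ [1/3, 2/3, 1.0]
```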
Hessian based problem
Manopt.HessianProblem — Type

HessianProblem <: Problem

Specify a problem for Hessian based algorithms.
Fields
- M – a manifold $\mathcal M$
- cost – a function $F:\mathcal M→ℝ$ to minimize
- gradient – the gradient $\operatorname{grad}F:\mathcal M → \mathcal T\mathcal M$ of the cost function $F$
- hessian – the Hessian $\operatorname{Hess}F(x)[⋅]: \mathcal T_{x} \mathcal M → \mathcal T_{x} \mathcal M$ of the cost function $F$
- precon – the symmetric, positive definite preconditioner (an approximation of the inverse of the Hessian of $F$)
Manopt.get_hessian — Functionget_hessian(p::HessianProblem{T}, q, X)
get_hessian!(p::HessianProblem{T}, Y, q, X)

Evaluate the Hessian of a HessianProblem p at the point q applied to a tangent vector X, i.e. $\operatorname{Hess}f(q)[X]$.
The evaluation is done in place of Y for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=MutatingEvaluation, memory for the result is allocated.
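A sketch for a hypothetical quadratic cost, where the Hessian is a constant linear map and the preconditioner is trivial; the constructor is assumed to take the fields in the order listed above:

```julia
using Manopt, Manifolds

M = Euclidean(2)
A = [2.0 0.0; 0.0 4.0]
f(M, p) = 1 / 2 * p' * A * p
grad_f(M, p) = A * p
Hess_f(M, p, X) = A * X    # Hess f(p)[X] is constant for a quadratic
precon(M, p, X) = X        # trivial (identity) preconditioner

P = HessianProblem(M, f, grad_f, Hess_f, precon)
get_hessian(P, [1.0, 1.0], [0.0, 1.0])   # [0.0, 4.0]
```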
Manopt.get_preconditioner — Function

get_preconditioner(p, x, ξ)

Evaluate the symmetric, positive definite preconditioner (an approximation of the inverse of the Hessian of the cost function F) of a HessianProblem p at the point x applied to a tangent vector ξ.
Primal dual based problem
Manopt.AbstractPrimalDualProblem — Type

AbstractPrimalDualProblem{T<:AbstractEvaluationType} <: Problem{T}

An abstract type for primal-dual-based problems.
Manopt.PrimalDualProblem — Type

PrimalDualProblem{T, mT <: AbstractManifold, nT <: AbstractManifold} <: AbstractPrimalDualProblem

Describes a Problem for the linearized or exact Chambolle-Pock algorithm. [BergmannHerzogSilvaLouzeiroTenbrinckVidalNunez2020][ChambollePock2011]
Fields
All fields with !! can either be mutating or nonmutating functions, which should be set depending on the parameter T <: AbstractEvaluationType.
- M, N – two manifolds $\mathcal M$, $\mathcal N$
- cost – $F + G(Λ(⋅))$ to evaluate intermediate cost function values
- linearized_forward_operator!! – linearized operator for the forward operation in the algorithm $DΛ$
- linearized_adjoint_operator!! – the adjoint differential $(DΛ)^* : \mathcal N → T\mathcal M$
- prox_F!! – the proximal map belonging to $F$
- prox_G_dual!! – the proximal map belonging to $g_n^*$
- Λ!! – (forward_operator) the forward operator (if given) $Λ: \mathcal M → \mathcal N$
Usually either $DΛ$ (for the linearized case) or $Λ$ is required.
Constructor
PrimalDualProblem(M, N, cost, prox_F, prox_G_dual, adjoint_linearized_operator;
linearized_forward_operator::Union{Function,Missing}=missing,
Λ::Union{Function,Missing}=missing,
evaluation::AbstractEvaluationType=AllocatingEvaluation()
)

The last optional argument can be used to provide the 4 or 5 functions as allocating or mutating (in-place computation) ones. Note that the first argument is always the manifold under consideration; the mutated argument is the second one.
Manopt.PrimalDualSemismoothNewtonProblem — Type

PrimalDualSemismoothNewtonProblem{T <: AbstractEvaluationType, mT <: AbstractManifold, nT <: AbstractManifold} <: AbstractPrimalDualProblem{T}

Describes a Problem for the Primal-dual Riemannian semismooth Newton algorithm. [DiepeveenLellmann2021]
Fields
- M, N – two manifolds $\mathcal M$, $\mathcal N$
- cost – $F + G(Λ(⋅))$ to evaluate intermediate cost function values
- linearized_operator – the linearization $DΛ(⋅)[⋅]$ of the operator $Λ(⋅)$
- linearized_adjoint_operator – the adjoint differential $(DΛ)^* \colon \mathcal N \to T\mathcal M$
- prox_F – the proximal map belonging to $F$
- diff_prox_F – the (Clarke generalized) differential of the proximal maps of $F$
- prox_G_dual – the proximal map belonging to $g_n^*$
- diff_prox_dual_G – the (Clarke generalized) differential of the proximal maps of $G^\ast_n$
- Λ – the exact forward operator. This operator is required if Λ(m)=n does not hold.
Constructor
PrimalDualSemismoothNewtonProblem(M, N, cost, prox_F, prox_G_dual, forward_operator, adjoint_linearized_operator, Λ)

Manopt.get_primal_prox — Function

y = get_primal_prox(p::AbstractPrimalDualProblem, σ, x)
get_primal_prox!(p::AbstractPrimalDualProblem, y, σ, x)

Evaluate the proximal map of $F$ stored within AbstractPrimalDualProblem
\[\operatorname{prox}_{σF}(x)\]
which can also be computed in place of y.
Manopt.get_dual_prox — Function

y = get_dual_prox(p::AbstractPrimalDualProblem, n, τ, ξ)
get_dual_prox!(p::AbstractPrimalDualProblem, y, n, τ, ξ)

Evaluate the proximal map of $G_n^*$ stored within AbstractPrimalDualProblem
\[\operatorname{prox}_{τG_n^*}(ξ)\]
which can also be computed in place of y.
Manopt.forward_operator — Function

y = forward_operator(p::AbstractPrimalDualProblem, x)
forward_operator!(p::AbstractPrimalDualProblem, y, x)

Evaluate the forward operator $Λ(x)$ stored within the AbstractPrimalDualProblem (in place of y).
Manopt.linearized_forward_operator — Function

Y = linearized_forward_operator(p::AbstractPrimalDualProblem, m, X, n)
linearized_forward_operator!(p::AbstractPrimalDualProblem, Y, m, X, n)

Evaluate the linearized operator (differential) $DΛ(m)[X]$ stored within the AbstractPrimalDualProblem (in place of Y), where n = Λ(m).
Manopt.adjoint_linearized_operator — Function

X = adjoint_linearized_operator(p::AbstractPrimalDualProblem, m, n, Y)
adjoint_linearized_operator!(p::AbstractPrimalDualProblem, X, m, n, Y)

Evaluate the adjoint of the linearized forward operator $(DΛ(m))^*[Y]$ stored within the AbstractPrimalDualProblem (in place of X). Since $Y∈T_n\mathcal N$, both $m$ and $n=Λ(m)$ are necessary arguments, mainly because the forward operator $Λ$ might be missing in p.
Manopt.get_differential_primal_prox — Function

y = get_differential_primal_prox(p::PrimalDualSemismoothNewtonProblem, σ, x)
get_differential_primal_prox!(p::PrimalDualSemismoothNewtonProblem, y, σ, x)

Evaluate the differential proximal map of $F$ stored within PrimalDualSemismoothNewtonProblem
\[D\operatorname{prox}_{σF}(x)[X]\]
which can also be computed in place of y.
Manopt.get_differential_dual_prox — Function

y = get_differential_dual_prox(p::PrimalDualSemismoothNewtonProblem, n, τ, ξ, Ξ)
get_differential_dual_prox!(p::PrimalDualSemismoothNewtonProblem, y, n, τ, ξ, Ξ)

Evaluate the differential proximal map of $G_n^*$ stored within PrimalDualSemismoothNewtonProblem
\[D\operatorname{prox}_{τG_n^*}(ξ)[Ξ]\]
which can also be computed in place of y.
Nonlinear least squares problem
Manopt.NonlinearLeastSquaresProblem — Type

NonlinearLeastSquaresProblem{T<:AbstractEvaluationType} <: Problem{T}

A type for nonlinear least squares problems. T is an AbstractEvaluationType for the F and Jacobian functions.
Specify a nonlinear least squares problem
Fields
- M – a manifold $\mathcal M$
- F – a function $F: \mathcal M → ℝ^d$ to minimize
- jacF!! – Jacobian of the function $F$
- jacB – the basis of tangent space used for computing the Jacobian
- num_components – number of values returned by F (equal to d)
Depending on the AbstractEvaluationType T the function $F$ has to be provided:
- as a function (M::AbstractManifold, p) -> v that allocates memory for v itself, for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p) -> v that works in place of v, for a MutatingEvaluation.
Also the Jacobian jacF!! is required:
- as a function (M::AbstractManifold, p; basis_domain::AbstractBasis) -> v that allocates memory for v itself, for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p; basis_domain::AbstractBasis) -> v that works in place of v, for a MutatingEvaluation.
Constructors
NonlinearLeastSquaresProblem(M, F, jacF, num_components; evaluation=AllocatingEvaluation(), jacB=DefaultOrthonormalBasis())
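A sketch with a small linear regression residual, d = 3 components on $\mathbb R^2$; the model and data are hypothetical:

```julia
using Manopt, Manifolds

M = Euclidean(2)
data = [1.9, 3.1, 3.9]
# residuals Fᵢ(p) = p₁ + i p₂ - dataᵢ for i = 1, 2, 3
F(M, p) = [p[1] + i * p[2] - data[i] for i in 1:3]
# Jacobian with respect to basis_domain; constant for a linear model
jacF(M, p; basis_domain=DefaultOrthonormalBasis()) = [1.0 1.0; 1.0 2.0; 1.0 3.0]

P = NonlinearLeastSquaresProblem(M, F, jacF, 3)
```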
- BergmannHerzogSilvaLouzeiroTenbrinckVidalNunez2020
R. Bergmann, R. Herzog, M. Silva Louzeiro, D. Tenbrinck, J. Vidal-Núñez: Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds, Foundations of Computational Mathematics, 2021. doi: 10.1007/s10208-020-09486-5 arXiv: 1908.02022
- ChambollePock2011
A. Chambolle, T. Pock: A first-order primal-dual algorithm for convex problems with applications to imaging, Journal of Mathematical Imaging and Vision 40(1), 120–145, 2011. doi: 10.1007/s10851-010-0251-1
- DiepeveenLellmann2021
W. Diepeveen, J. Lellmann: An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising, SIAM Journal on Imaging Sciences, 2021. doi: 10.1137/21M1398513