Alternating Gradient Descent
Manopt.alternating_gradient_descent — Function

alternating_gradient_descent(M, F, gradF, x)

perform an alternating gradient descent
Input
- `M` – the product manifold $\mathcal M = \mathcal M_1 × \mathcal M_2 × ⋯ × \mathcal M_n$
- `F` – the objective function (cost) defined on `M`.
- `gradF` – a gradient, that can be of two cases
  - a single function returning a `ProductRepr`, or
  - a vector of functions, each returning a component part of the whole gradient
- `x` – an initial value $x ∈ \mathcal M$
Optional
- `evaluation` – (`AllocatingEvaluation`) specify whether the gradient(s) work by allocation (default), of the form `gradF(M, x)`, or `MutatingEvaluation` in place, i.e. of the form `gradF!(M, X, x)` (elementwise).
- `evaluation_order` – (`:Linear`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle permuted sequence (`:Random`), or the default `:Linear` one.
- `inner_iterations` – (`5`) how many gradient steps to take in a component before alternating to the next
- `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
- `stepsize` – (`ArmijoLinesearch()`) a `Stepsize`
- `order` – (`[1:n]`) the initial permutation, where `n` is the number of gradients in `gradF`.
- `retraction_method` – (`default_retraction_method(M)`) a `retraction(M, p, X)` to use.
Output
usually the obtained (approximate) minimizer, see `get_solver_return` for details
This Problem requires the `ProductManifold` from Manifolds.jl, so Manifolds.jl needs to be loaded.
The input of each of the (component) gradients is still the whole point `x`; all components other than the `i`th are assumed to be fixed, and only the gradient of the `i`th component is computed / returned.
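For illustration, a minimal sketch of a call on a Euclidean product manifold. The toy cost `F`, the target point `a`, and the initial point `x0` are hypothetical (not from the package docs), and the sketch assumes the `F(M, x)` / `gradF(M, x)` signatures described above, with the whole gradient returned as a single `ProductRepr`:

```julia
using Manopt, Manifolds, LinearAlgebra

# hypothetical toy problem: minimize F(x) = ‖x₁ - a‖² + ‖x₁ - x₂‖² on ℝ² × ℝ²
M = ProductManifold(Euclidean(2), Euclidean(2))
a = [1.0, 0.0]

F(M, x) = norm(x[M, 1] - a)^2 + norm(x[M, 1] - x[M, 2])^2

# a single gradient function returning the whole gradient as one ProductRepr
gradF(M, x) = ProductRepr(
    2 .* (x[M, 1] .- a) .+ 2 .* (x[M, 1] .- x[M, 2]),  # gradient w.r.t. x₁
    2 .* (x[M, 2] .- x[M, 1]),                         # gradient w.r.t. x₂
)

x0 = ProductRepr([0.0, 0.0], [0.0, 0.0])
x_opt = alternating_gradient_descent(M, F, gradF, x0)
```

Here each alternating subproblem is strongly convex, so the iterates approach the minimizer $x_1 = x_2 = a$.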
Manopt.alternating_gradient_descent! — Function

alternating_gradient_descent!(M, F, gradF, x)

perform an alternating gradient descent in place of `x`.
Input
- `M` – a manifold $\mathcal M$
- `F` – the objective function (cost)
- `gradF` – a gradient function, that either returns a vector of the component gradients or is a vector of gradient functions
- `x` – an initial value $x ∈ \mathcal M$
for all optional parameters, see `alternating_gradient_descent`.
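A hedged usage sketch, reusing `M`, `F`, `gradF`, and `x0` from the example above; `x0` is overwritten with the result, and the keywords are those listed under Optional:

```julia
alternating_gradient_descent!(
    M, F, gradF, x0;
    inner_iterations=10,                        # more gradient steps per component
    stopping_criterion=StopAfterIteration(200),
)
```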
Problem
Manopt.AlternatingGradientProblem — Type

AlternatingGradientProblem <: Problem

An alternating gradient problem consists of
- a `ProductManifold` `M` $= \mathcal M = \mathcal M_1 × ⋯ × \mathcal M_n$
- a cost function $F(x)$
- a gradient $\operatorname{grad}F$ that is either
  - given as one function $\operatorname{grad}F$ returning a tangent vector `X` on `M`, or
  - an array of gradient functions $\operatorname{grad}F_i$, $i=1,…,n$, each returning a component of the gradient
This Problem requires the `ProductManifold` from Manifolds.jl, so Manifolds.jl needs to be loaded.
The input of each of the (component) gradients is still the whole point `x`; all components other than the `i`th are assumed to be fixed, and only the gradient of the `i`th component is computed / returned.
Constructors
AlternatingGradientProblem(M::ProductManifold, F, gradF::Function;
evaluation=AllocatingEvaluation()
)
AlternatingGradientProblem(M::ProductManifold, F, gradF::AbstractVector{<:Function};
evaluation=AllocatingEvaluation()
)

Create an alternating gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.
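As a sketch, both constructor forms, reusing `M`, `F`, `gradF`, and `a` from the first example above. The component functions `gradF1` and `gradF2` are hypothetical; per the description above, each receives the whole point `x` but returns only its own component of the gradient:

```julia
# one function returning the whole gradient as a single tangent vector
p1 = AlternatingGradientProblem(M, F, gradF)

# a vector of functions, one per component of the product manifold
gradF1(M, x) = 2 .* (x[M, 1] .- a) .+ 2 .* (x[M, 1] .- x[M, 2])
gradF2(M, x) = 2 .* (x[M, 2] .- x[M, 1])
p2 = AlternatingGradientProblem(M, F, [gradF1, gradF2])
```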
Options
Manopt.AlternatingGradientDescentOptions — Type

AlternatingGradientDescentOptions <: AbstractGradientDescentOptions

Store the fields for an alternating gradient descent algorithm, see also `AlternatingGradientProblem` and `alternating_gradient_descent`.
Fields
- `direction` – (`AlternatingGradient(zero_vector(M, x))`) a `DirectionUpdateRule`
- `evaluation_order` – (`:Linear`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle newly permuted sequence (`:Random`), or the default `:Linear` evaluation order.
- `inner_iterations` – (`5`) how many gradient steps to take in a component before alternating to the next
- `order` – the current permutation
- `retraction_method` – (`default_retraction_method(M)`) a `retraction(M, x, ξ)` to use.
- `stepsize` – (`ConstantStepsize(M)`) a `Stepsize`
- `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
- `x` – the current iterate
- `k`, `i` – internal counters for the outer and inner iterations, respectively.
Constructors
AlternatingGradientDescentOptions(M, x; kwargs...)

Generate the options for point `x`, where `inner_iterations`, `order_type`, `order`, `retraction_method`, `stopping_criterion`, and `stepsize` are keyword arguments.
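A sketch of generating the options directly with the keywords just listed, reusing `M` and `x0` from the earlier example; the values shown match the documented defaults except for `order`, which here has one entry per component gradient:

```julia
o = AlternatingGradientDescentOptions(
    M, x0;
    inner_iterations=5,
    order=collect(1:2),                          # one index per component
    stopping_criterion=StopAfterIteration(1000),
)
```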
Additionally, the options share a `DirectionUpdateRule`, which chooses the current component, so they can be decorated further; the innermost rule should, however, always be the following one.
Manopt.AlternatingGradient — Type

AlternatingGradient <: DirectionUpdateRule

The default gradient processor, which just evaluates the (alternating) gradient on one of the components.