Alternating Gradient Descent

Manopt.alternating_gradient_descentFunction
alternating_gradient_descent(M, F, gradF, x)

perform an alternating gradient descent

Input

  • M – the product manifold $\mathcal M = \mathcal M_1 × \mathcal M_2 × ⋯ ×\mathcal M_n$
  • F – the objective function (cost) defined on M.
  • gradF – a gradient, given in one of two ways
    • a single function returning a ProductRepr of the whole gradient, or
    • a vector of functions, each returning one component of the gradient
  • x – an initial value $x ∈ \mathcal M$

Optional

  • evaluation – (AllocatingEvaluation) specify whether the gradient(s) work by allocation (default), i.e. of the form gradF(M, x), or as a MutatingEvaluation in place, i.e. of the form gradF!(M, X, x) (elementwise).
  • evaluation_order – (:Linear) whether to use a fixed, randomly permuted sequence (:FixedRandom), a newly permuted sequence per cycle (:Random), or the default :Linear order.
  • inner_iterations – (5) how many gradient steps to take in a component before alternating to the next
  • stopping_criterion – (StopAfterIteration(1000)) a StoppingCriterion
  • stepsize – (ArmijoLinesearch()) a Stepsize
  • order – ([1:n]) the initial permutation, where n is the number of gradients in gradF.
  • retraction_method – (default_retraction_method(M)) a retraction(M, p, X) to use.

Output

usually the obtained (approximate) minimizer, see get_solver_return for details

Note

This problem requires the ProductManifold from Manifolds.jl, i.e. Manifolds.jl has to be loaded.

Note

The input of each of the (component) gradients is still the whole point x; all components other than the ith are assumed to be fixed, and only the gradient with respect to the ith component is computed and returned.
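As a usage sketch, consider a separable least-squares cost on a product of two Euclidean factors with one gradient function per component. This assumes Manopt.jl and Manifolds.jl are loaded, that the component gradients use the allocating signature gradF_i(M, x), and the target points `a` and `b` are illustrative data, not part of the API:

```julia
using Manopt, Manifolds

# Product manifold M = R² × R²; the cost is separable over the two components.
M = ProductManifold(Euclidean(2), Euclidean(2))
a, b = [1.0, 0.0], [0.0, 1.0]   # illustrative target points
F(M, x) = sum(abs2, x[M, 1] .- a) + sum(abs2, x[M, 2] .- b)
# One gradient function per component; each receives the whole point x
# but returns only the gradient with respect to its own component.
gradF = [
    (M, x) -> 2 .* (x[M, 1] .- a),
    (M, x) -> 2 .* (x[M, 2] .- b),
]
x0 = ProductRepr([0.0, 0.0], [0.0, 0.0])
x_opt = alternating_gradient_descent(M, F, gradF, x0)
```

Since the cost is separable here, each inner cycle moves one component toward its own target while the other stays fixed.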

source
Manopt.alternating_gradient_descent!Function
alternating_gradient_descent!(M, F, gradF, x)

perform an alternating gradient descent in place of x.

Input

  • M a manifold $\mathcal M$
  • F – the objective function (cost)
  • gradF – a gradient function, that either returns the whole gradient or is a vector of the component gradient functions
  • x – an initial value $x ∈ \mathcal M$

for all optional parameters, see alternating_gradient_descent.

source

Problem

Manopt.AlternatingGradientProblemType
AlternatingGradientProblem <: Problem

An alternating gradient problem consists of

  • a ProductManifold $\mathcal M = \mathcal M_1 × ⋯ × \mathcal M_n$
  • a cost function $F(x)$
  • a gradient $\operatorname{grad}F$ that is either
    • given as one function $\operatorname{grad}F$ returning a tangent vector X on M or
    • an array of gradient functions $\operatorname{grad}F_i$, $i=1,…,n$, each returning a component of the gradient,
    which might be allocating or mutating variants, but not a mix of both.
Note

This problem requires the ProductManifold from Manifolds.jl, i.e. Manifolds.jl has to be loaded.

Note

The input of each of the (component) gradients is still the whole point x; all components other than the ith are assumed to be fixed, and only the gradient with respect to the ith component is computed and returned.

Constructors

AlternatingGradientProblem(M::ProductManifold, F, gradF::Function;
    evaluation=AllocatingEvaluation()
)
AlternatingGradientProblem(M::ProductManifold, F, gradF::AbstractVector{<:Function};
    evaluation=AllocatingEvaluation()
)

Create an alternating gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.
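For illustration, a sketch of both constructor variants on a small product manifold. This assumes Manifolds.jl is loaded; the cost and gradients are toy examples, not part of the package:

```julia
using Manopt, Manifolds

M = ProductManifold(Euclidean(2), Euclidean(2))
F(M, x) = sum(abs2, x[M, 1]) + sum(abs2, x[M, 2])

# Variant 1: one function returning the whole gradient as a ProductRepr
gradF(M, x) = ProductRepr(2 .* x[M, 1], 2 .* x[M, 2])
p1 = AlternatingGradientProblem(M, F, gradF)

# Variant 2: a vector of component gradient functions
gradFv = [(M, x) -> 2 .* x[M, 1], (M, x) -> 2 .* x[M, 2]]
p2 = AlternatingGradientProblem(M, F, gradFv)
```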

source

Options

Manopt.AlternatingGradientDescentOptionsType
AlternatingGradientDescentOptions <: AbstractGradientDescentOptions

Store the fields for an alternating gradient descent algorithm, see also AlternatingGradientProblem and alternating_gradient_descent.

Fields

  • direction – (AlternatingGradient(zero_vector(M, x))) a DirectionUpdateRule
  • evaluation_order – (:Linear) whether to use a randomly permuted sequence (:FixedRandom), a per cycle newly permuted sequence (:Random), or the default :Linear evaluation order.
  • inner_iterations – (5) how many gradient steps to take in a component before alternating to the next
  • order – the current permutation
  • retraction_method – (default_retraction_method(M)) a retraction(M, p, X) to use.
  • stepsize – (ConstantStepsize(M)) a Stepsize
  • stopping_criterion – (StopAfterIteration(1000)) a StoppingCriterion
  • x – the current iterate
  • k, i – internal counters for the outer and inner iterations, respectively.

Constructors

AlternatingGradientDescentOptions(M, x; kwargs...)

Generate the options for point x, where inner_iterations, order_type, order, retraction_method, stopping_criterion, and stepsize are keyword arguments.
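A minimal construction sketch, assuming Manifolds.jl is loaded; the keyword values are illustrative choices, not defaults:

```julia
using Manopt, Manifolds

M = ProductManifold(Euclidean(2), Euclidean(2))
x0 = ProductRepr([0.0, 0.0], [0.0, 0.0])
o = AlternatingGradientDescentOptions(M, x0;
    inner_iterations=3,   # three steps per component before switching
    order_type=:Random,   # new permutation of the components every cycle
)
```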

source

Additionally, the options share a DirectionUpdateRule, which chooses the current component, so they can be decorated further; the innermost one should always be the following one, though.

Manopt.AlternatingGradientType
AlternatingGradient <: DirectionUpdateRule

The default gradient processor, which just evaluates the (alternating) gradient on one of the components

source