Subgradient method

Manopt.subgradient_method — Function
subgradient_method(M, f, ∂f, p=rand(M); kwargs...)
subgradient_method(M, sgo, p=rand(M); kwargs...)
subgradient_method!(M, f, ∂f, p; kwargs...)
subgradient_method!(M, sgo, p; kwargs...)

perform a subgradient method $p^{(k+1)} = \operatorname{retr}\bigl(p^{(k)}, -s^{(k)}∂f(p^{(k)})\bigr)$, where $\operatorname{retr}$ is a retraction and $s^{(k)}$ is a step size.

Though the subgradient might be set-valued, the argument ∂f should always return one element from the subdifferential; the returned element need not be chosen deterministically. For more details see [FO98].

Input

  • M::AbstractManifold: a Riemannian manifold $\mathcal M$
  • f: a cost function $f: \mathcal M→ ℝ$ implemented as (M, p) -> v
  • ∂f: the (sub)gradient $∂ f: \mathcal M → T\mathcal M$ of f
  • p: a point on the manifold $\mathcal M$

alternatively to f and ∂f a ManifoldSubgradientObjective sgo can be provided.

Keyword arguments

  • evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating its result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
  • stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
  • X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector

and the ones that are passed to decorate_state! for decorators.

Output

the obtained (approximate) minimizer $p^*$, see get_solver_return for details
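As a minimal sketch of a call, consider the Riemannian median of a few points on the sphere, a classical nonsmooth cost this solver handles. This example assumes Manifolds.jl is available; the names pts, f, and ∂f are illustrative, not part of the Manopt API:

```julia
using Manopt, Manifolds

M = Sphere(2)
# Riemannian median: f is a sum of distances, hence nonsmooth at the data points.
pts = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
f(M, p) = sum(distance(M, p, q) for q in pts)
# Return one element of the subdifferential: away from the data points the cost
# is differentiable and this is its gradient; at a data point, skipping the
# vanishing-distance term still yields a valid subgradient.
function ∂f(M, p)
    X = zero_vector(M, p)
    for q in pts
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ d)
    end
    return X
end

p0 = 1 / sqrt(3) .* [1.0, 1.0, 1.0]
q = subgradient_method(M, f, ∂f, p0)
```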

source
Manopt.subgradient_method! — Function
subgradient_method(M, f, ∂f, p=rand(M); kwargs...)
subgradient_method(M, sgo, p=rand(M); kwargs...)
subgradient_method!(M, f, ∂f, p; kwargs...)
subgradient_method!(M, sgo, p; kwargs...)

perform a subgradient method $p^{(k+1)} = \operatorname{retr}\bigl(p^{(k)}, -s^{(k)}∂f(p^{(k)})\bigr)$, where $\operatorname{retr}$ is a retraction and $s^{(k)}$ is a step size.

Though the subgradient might be set-valued, the argument ∂f should always return one element from the subdifferential; the returned element need not be chosen deterministically. For more details see [FO98].

Input

  • M::AbstractManifold: a Riemannian manifold $\mathcal M$
  • f: a cost function $f: \mathcal M→ ℝ$ implemented as (M, p) -> v
  • ∂f: the (sub)gradient $∂ f: \mathcal M → T\mathcal M$ of f
  • p: a point on the manifold $\mathcal M$

alternatively to f and ∂f a ManifoldSubgradientObjective sgo can be provided.

Keyword arguments

  • evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating its result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
  • stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
  • X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector

and the ones that are passed to decorate_state! for decorators.

Output

the obtained (approximate) minimizer $p^*$, see get_solver_return for details
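A sketch of the in-place variant for the same kind of nonsmooth cost, a Riemannian median on the sphere: with evaluation=InplaceEvaluation() the subgradient function writes its result into a tangent-vector argument, and the initial point p0 is overwritten with the result. Manifolds.jl is assumed; pts, f, and ∂f! are illustrative names:

```julia
using Manopt, Manifolds

M = Sphere(2)
pts = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
f(M, p) = sum(distance(M, p, q) for q in pts)
# In-place subgradient: fill X instead of allocating a new tangent vector.
function ∂f!(M, X, p)
    zero_vector!(M, X, p)
    for q in pts
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ d)
    end
    return X
end

p0 = 1 / sqrt(3) .* [1.0, 1.0, 1.0]
subgradient_method!(M, f, ∂f!, p0; evaluation=InplaceEvaluation())
# p0 now holds the (approximate) minimizer.
```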

source

State

Manopt.SubGradientMethodState — Type
SubGradientMethodState <: AbstractManoptSolverState

stores option values for a subgradient_method solver

Fields

  • p::P: a point on the manifold $\mathcal M$ storing the current iterate
  • p_star: the best point obtained so far, the candidate minimizer $p^*$
  • retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
  • stepsize::Stepsize: a functor inheriting from Stepsize to determine a step size
  • stop::StoppingCriterion: a functor indicating that the stopping criterion is fulfilled
  • X: the current element from the possible subgradients at p that was last evaluated.

Constructor

SubGradientMethodState(M::AbstractManifold; kwargs...)

Initialise the subgradient method state.

Keyword arguments

  • retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
  • p=rand(M): a point on the manifold $\mathcal M$ to specify the initial value
  • stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
  • stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
  • X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector
source
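A short sketch of constructing this state by hand, for example to inspect or decorate a solver run; the keyword values shown are illustrative and Manifolds.jl is assumed for the sphere:

```julia
using Manopt, Manifolds

M = Sphere(2)
st = SubGradientMethodState(
    M;
    p=[1.0, 0.0, 0.0],
    stopping_criterion=StopAfterIteration(200),
)
get_iterate(st)  # the current iterate, here still the initial point
```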

For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the gradient descent actions.

Technical details

The subgradient_method solver requires the following functions of a manifold to be available

  • A retract!(M, q, p, X); it is recommended to set the default_retraction_method to a favourite retraction. If this default is set, a retraction_method= does not have to be specified.

Literature

[FO98]
O. Ferreira and P. R. Oliveira. Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97, 93–104 (1998).