Subgradient method
Manopt.subgradient_method — Function

subgradient_method(M, f, ∂f, p=rand(M); kwargs...)
subgradient_method(M, sgo, p=rand(M); kwargs...)

perform a subgradient method $p_{k+1} = \mathrm{retr}(p_k, s_k∂f(p_k))$,
where $\mathrm{retr}$ is a retraction and $s_k$ is a step size, usually the ConstantStepsize, but it can also be specified. Though the subgradient might be set-valued, the argument ∂f should always return one element from the subdifferential, though not necessarily deterministically. For more details see [FO98].
Input
- M: a manifold $\mathcal M$
- f: a cost function $f:\mathcal M→ℝ$ to minimize
- ∂f: the (sub)gradient $∂f: \mathcal M→ T\mathcal M$ of f, restricted to always only returning one value/element from the subdifferential. This function can be passed as an allocation function (M, p) -> X or a mutating function (M, X, p) -> X, see evaluation.
- p: (rand(M)) an initial value $p_0=p ∈ \mathcal M$

Alternatively to f and ∂f a ManifoldSubgradientObjective sgo can be provided.
Optional
- evaluation: (AllocatingEvaluation) specify whether the subgradient works by allocation (default) of the form ∂f(M, p) or InplaceEvaluation in place of the form ∂f!(M, X, p).
- retraction_method: (default_retraction_method(M, typeof(p))) a retraction to use.
- stepsize: (ConstantStepsize(M)) specify a Stepsize.
- stopping_criterion: (StopAfterIteration(5000)) a functor, see StoppingCriterion, indicating when to stop.
and the ones that are passed to decorate_state! for decorators.
Output
the obtained (approximate) minimizer $p^*$, see get_solver_return for details
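For example, the following minimal sketch computes a Riemannian median on the sphere; the cost f, the subgradient ∂f, and the data points are illustrative assumptions and not part of the docstring.

```julia
using Manopt, Manifolds

M = Sphere(2)
data = [rand(M) for _ in 1:10]      # hypothetical data points

# cost: mean distance to the data points
f(M, p) = sum(distance(M, p, q) for q in data) / length(data)

# return one element of the subdifferential; at a data point the zero vector is a valid choice
function ∂f(M, p)
    X = zero_vector(M, p)
    for q in data
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ (length(data) * d))
    end
    return X
end

p_star = subgradient_method(
    M, f, ∂f, rand(M);
    stopping_criterion=StopAfterIteration(2000),
    debug=[:Iteration, :Cost, "\n", 500],   # a debug decorator passed via decorate_state!
)
```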
Manopt.subgradient_method! — Function

subgradient_method!(M, f, ∂f, p)
subgradient_method!(M, sgo, p)

perform a subgradient method $p_{k+1} = \mathrm{retr}(p_k, s_k∂f(p_k))$ in place of p.
Input
- M: a manifold $\mathcal M$
- f: a cost function $f:\mathcal M→ℝ$ to minimize
- ∂f: the (sub)gradient $∂f: \mathcal M→ T\mathcal M$ of f, restricted to always only returning one value/element from the subdifferential. This function can be passed as an allocation function (M, p) -> X or a mutating function (M, X, p) -> X, see evaluation.
- p: an initial value $p_0=p ∈ \mathcal M$

Alternatively to f and ∂f a ManifoldSubgradientObjective sgo can be provided.
For more details and all optional parameters, see subgradient_method.
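A sketch of the in-place variant, reusing the hypothetical cost f and data from the example above, but providing the subgradient as a mutating function together with evaluation=InplaceEvaluation():

```julia
# mutating subgradient: writes one subdifferential element into X
function ∂f!(M, X, p)
    zero_vector!(M, X, p)
    for q in data
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ (length(data) * d))
    end
    return X
end

p0 = rand(M)
subgradient_method!(M, f, ∂f!, p0; evaluation=InplaceEvaluation())
# p0 now holds the resulting (approximate) minimizer
```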
State
Manopt.SubGradientMethodState — Type

SubGradientMethodState <: AbstractManoptSolverState

stores option values for a subgradient_method solver.
Fields
- retraction_method: the retraction to use within
- stepsize: (ConstantStepsize(M)) a Stepsize
- stop: (StopAfterIteration(5000)) a StoppingCriterion
- p: (initial or current) value the algorithm is at
- p_star: optimal value (initialized to a copy of p)
- X: (zero_vector(M, p)) the current element from the possible subgradients at p that was last evaluated
Constructor
SubGradientMethodState(M::AbstractManifold, p; kwargs...)

with keywords for all fields besides p_star, which obtains the same type as p. You can use X= to specify the type of tangent vector to use.
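As a sketch (assuming the manifold M, cost f, and subgradient ∂f from the examples above), the state can also be constructed directly and run with solve!:

```julia
sgo = ManifoldSubgradientObjective(f, ∂f)
problem = DefaultManoptProblem(M, sgo)
state = SubGradientMethodState(
    M, rand(M);
    stopping_criterion=StopAfterIteration(1000),
    stepsize=ConstantStepsize(M),
)
solve!(problem, state)
q = get_solver_result(state)
```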
For DebugActions and RecordActions to record (sub)gradient, its norm and the step sizes, see the gradient descent actions.
Technical details
The subgradient_method solver requires the following functions of a manifold to be available
- A retract!(M, q, p, X); it is recommended to set the default_retraction_method to a favourite retraction. If this default is set, a retraction_method= does not have to be specified.
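Alternatively, a specific retraction can be passed explicitly; the following one-line sketch assumes the sphere example from above and uses the ProjectionRetraction available for that manifold:

```julia
q = subgradient_method(M, f, ∂f, rand(M); retraction_method=ProjectionRetraction())
```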
Literature
- [FO98] O. Ferreira and P. R. Oliveira. Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97, 93–104 (1998).