Subgradient Method
Manopt.subgradient_method — Function

`subgradient_method(M, F, ∂F, x)`

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, s_k∂F(x_k))$, where $\mathrm{retr}$ is a retraction and $s_k$ is a step size, which can be specified as a function but is usually set to a constant value. Though the subgradient might be set-valued, the argument `∂F` should always return one element from the subgradient, but not necessarily deterministically.
Input
- `M` – a manifold $\mathcal M$
- `F` – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
- `∂F` – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of `F`, restricted to always returning only one value/element from the subgradient
- `x` – an initial value $x ∈ \mathcal M$
Optional
- `stepsize` – (`ConstantStepsize(1.)`) specify a `Stepsize`
- `retraction` – (`exp`) a retraction `(M, x, ξ)` to use.
- `stopping_criterion` – (`StopAfterIteration(5000)`) a functor, see `StoppingCriterion`, indicating when to stop.
- `return_options` – (`false`) – if activated, the extended result, i.e. the complete `Options`, is returned. This can be used to access recorded values. If set to `false` (default), just the optimal value `x_opt` is returned.
... and the ones that are passed to `decorate_options` for decorators.
Output
`x_opt` – the resulting (approximately critical) point of the subgradient method

OR

`options` – the options returned by the solver (see `return_options`)
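The following sketch illustrates a typical call; it assumes `Manifolds.jl` for the `Sphere` manifold, and the points, cost, and subgradient chosen here (a Riemannian median of three points) are illustrative, not part of the docstring:

```julia
# Hedged sketch: Riemannian median of three points on the 2-sphere,
# assuming Manifolds.jl provides Sphere, distance, and log.
using Manopt, Manifolds

M = Sphere(2)
pts = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
F(x) = sum(distance(M, x, p) for p in pts)
# return one element of the subdifferential of the sum of distances;
# the guard avoids division by zero when x coincides with a data point
function ∂F(x)
    return sum(-log(M, x, p) / max(distance(M, x, p), 1e-12) for p in pts)
end
x0 = [1.0, 0.0, 0.0]
x_opt = subgradient_method(M, F, ∂F, x0;
    stepsize = ConstantStepsize(0.1),
    stopping_criterion = StopAfterIteration(100))
```

Since the subgradient method is not a descent method, `x_opt` is the best point found subject to the stopping criterion, and a decaying `stepsize` usually yields better accuracy than a constant one.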
Manopt.subgradient_method! — Function

`subgradient_method!(M, F, ∂F, x)`

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, s_k∂F(x_k))$ in place of `x`.
Input
- `M` – a manifold $\mathcal M$
- `F` – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
- `∂F` – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of `F`, restricted to always returning only one value/element from the subgradient
- `x` – an initial value $x ∈ \mathcal M$
for more details and all optional parameters, see `subgradient_method`.
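The in-place variant mutates the start point, which avoids allocating a separate result; a minimal hedged sketch (cost `F` and subgradient `∂F` as set up for `subgradient_method` above):

```julia
# Hedged sketch of the in-place variant: x is overwritten with the result.
x = copy(x0)
subgradient_method!(M, F, ∂F, x)
# x now holds the (approximately critical) point
```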
Options
Manopt.SubGradientMethodOptions — Type

`SubGradientMethodOptions <: Options`

stores option values for a `subgradient_method` solver
Fields
- `retraction_method` – the retraction to use within
- `stepsize` – a `Stepsize`
- `stop` – a `StoppingCriterion`
- `x` – the (initial or current) value the algorithm is at
- `x_optimal` – the optimal value
- `∂` – the current element from the possible subgradients at `x` that is used
For `DebugAction`s and `RecordAction`s to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.
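To make the iteration $x_{k+1} = \mathrm{retr}(x_k, s_k∂F(x_k))$ concrete without any solver machinery, the following self-contained sketch runs it by hand on the unit sphere, using the normalization retraction; all names here (`retr`, `proj`, the cost, the step size) are illustrative choices, not Manopt API, and the minus sign folds the descent direction into the step:

```julia
# Self-contained sketch of the subgradient iteration on the sphere S^2.
using LinearAlgebra

retr(x, ξ) = normalize(x + ξ)      # retraction: move, then project back to the sphere
proj(x, v) = v - dot(x, v) * x     # orthogonal projection onto the tangent space at x

p = normalize([1.0, 2.0, 3.0])     # target point
F(x) = norm(x - p)                 # nonsmooth at x = p
# one element of the subdifferential (the zero vector at the kink)
∂F(x) = x == p ? zero(x) : proj(x, (x - p) / norm(x - p))

x = normalize([1.0, 0.0, 0.0])
s = 0.1                            # constant step size
for k in 1:200
    x = retr(x, -s * ∂F(x))        # the update from the docstring, descent direction
end
```

With a constant step size the iterates only reach a neighborhood of the minimizer; this is why `Stepsize` objects such as decaying schedules exist.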