Subgradient Method
Manopt.subGradientMethod — Function.

`subGradientMethod(M, F, ∂F, x)`

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k\partial F(x_k))$, where $\mathrm{retr}$ is a retraction and the step size $s_k$ can be specified as a function, though it is usually set to a constant value. While the subgradient might be set-valued, the argument `∂F` should always return a single element from the subdifferential.
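The iteration can be sketched as follows; this is a minimal illustration, not the solver's actual internals, and it assumes a retraction `retr(M, x, ξ)` and a step size rule `s(k)` are given:

```julia
# A minimal sketch of the subgradient iteration above; assumes a
# retraction retr(M, x, ξ) and a step size rule s(k).
function subgradient_sketch(M, ∂F, x, retr, s, maxIter)
    for k in 1:maxIter
        ξ = ∂F(x)                 # one element of the subdifferential
        x = retr(M, x, -s(k) * ξ) # step against the chosen subgradient
    end
    return x
end
```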
Input
- `M` – a manifold $\mathcal M$
- `F` – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
- `∂F` – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of `F`, restricted to always returning one value/element from the subdifferential
- `x` – an initial value $x\in\mathcal M$
Optional
- `stepsize` – (`ConstantStepsize(1.)`) specify a `Stepsize`
- `retraction` – (`exp`) a retraction `(M,x,ξ)` to use.
- `stoppingCriterion` – (`stopWhenAny(stopAfterIteration(200), stopWhenGradientNormLess(10.0^-8))`) a functor, see `StoppingCriterion`, indicating when to stop.
- `returnOptions` – (`false`) – if activated, the extended result, i.e. the complete `Options`, is returned. This can be used to access recorded values. If set to `false` (default), just the optimal value `xOpt` is returned.
... and the ones that are passed to `decorateOptions` for decorators. A call sketch combining these keywords follows below.
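For illustration, a hedged call sketch using the keywords above; it assumes `M`, `F`, `∂F`, and `x` are already set up as described in the Input section, and the numeric values are arbitrary:

```julia
xOpt = subGradientMethod(M, F, ∂F, x;
    stepsize = ConstantStepsize(0.5),
    retraction = exp,
    stoppingCriterion = stopWhenAny(
        stopAfterIteration(500),
        stopWhenGradientNormLess(10.0^-6),
    ),
)
```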
Output
`xOpt` – the resulting (approximately critical) point of the subgradient method
OR
`options` – the options returned by the solver (see `returnOptions`)
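As a slightly larger, hedged example, the following sketch applies the solver to the Riemannian median on a sphere, a classical nonsmooth cost. It assumes `Sphere(2)`, `distance(M, x, y)`, and `log(M, x, y)` are available with these signatures; `random_point` is a hypothetical placeholder you would replace with your own data:

```julia
using Manopt

M = Sphere(2)  # assumption: a 2-sphere manifold with this constructor
# data points on M; random_point is a hypothetical sampler
pts = [random_point(M) for _ in 1:10]

# Riemannian median cost: nonsmooth at the data points
F(x) = sum(distance(M, x, p) for p in pts)
# one subdifferential element: the Riemannian gradient wherever F is
# smooth, i.e. for x distinct from all data points
∂F(x) = sum(-log(M, x, p) / distance(M, x, p) for p in pts)

xOpt = subGradientMethod(M, F, ∂F, pts[1])
```

With `returnOptions = true` the same call would instead return the complete `Options` (see the Fields below), from which the current iterate and recorded values can be read.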
Options
Manopt.SubGradientMethodOptions — Type.

`SubGradientMethodOptions <: Options`

stores option values for a `subGradientMethod` solver
Fields
- `retraction` – the retraction to use within
- `stepsize` – a `Stepsize`
- `stop` – a `StoppingCriterion`
- `x` – the (initial or current) value the algorithm is at
- `optimalX` – the optimal value
For `DebugAction`s and `RecordAction`s to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.