Subgradient Method
Manopt.subGradientMethod — Function

subGradientMethod(M, F, ∂F, x)
perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k\partial F(x_k))$,
where $\mathrm{retr}$ is a retraction and the step size $s_k$ can be specified as a function, though it is usually set to a constant value. While the subgradient might be set-valued, the argument ∂F should always return a single element from the subgradient.
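To make the update rule above concrete, the following is a minimal sketch of the iteration in plain Julia, independent of the solver interface documented below; retr, ∂F, the step size rule s, and the start point x0 are user-supplied placeholders, not part of the Manopt interface.

# Minimal sketch of the iteration above, outside of Manopt's solver interface.
# `retr`, `∂F`, `s`, and `x0` are placeholders: a retraction, a function returning
# one subgradient element, a step size rule k ↦ s_k, and a start point.
function subgradient_iteration_sketch(retr, ∂F, x0, s, maxiter)
    x = x0
    for k in 1:maxiter
        x = retr(x, -s(k) * ∂F(x))   # x_{k+1} = retr(x_k, -s_k ∂F(x_k))
    end
    return x
end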
Input

M – a manifold $\mathcal M$
F – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
∂F – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of F, restricted to always returning only one value/element from the subgradient
x – an initial value $x\in\mathcal M$
Optional

stepsize – (ConstantStepsize(1.)) specify a Stepsize
retraction – (exp) a retraction(M,x,ξ) to use.
stoppingCriterion – (stopWhenAny(stopAfterIteration(200), stopWhenGradientNormLess(10.0^-8))) a functor, see StoppingCriterion, indicating when to stop.
returnOptions – (false) if activated, the extended result, i.e. the complete Options, are returned. This can be used to access recorded values. If set to false (default), just the optimal value xOpt is returned.

... and the ones that are passed to decorateOptions for decorators.
Output

xOpt – the resulting (approximately critical) point of the subgradient method
OR
options – the options returned by the solver (see returnOptions)
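The following sketches a call combining the optional arguments above; it assumes M, F, ∂F, and x have already been set up as described under Input, and the concrete step size and iteration limit are illustrative choices, not defaults.

using Manopt

# M, F, ∂F, and x are assumed to be defined as in the Input section above
# (a manifold, a cost function, a single-valued subgradient, and a start point).
xOpt = subGradientMethod(M, F, ∂F, x;
    stepsize = ConstantStepsize(0.5),      # illustrative constant step size
    stoppingCriterion = stopWhenAny(
        stopAfterIteration(500),           # illustrative iteration limit
        stopWhenGradientNormLess(10.0^-8)
    )
)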
Options

Manopt.SubGradientMethodOptions — Type

SubGradientMethodOptions <: Options

stores option values for a subGradientMethod solver
Fields

retraction – the retraction to use within the subgradient method
stepsize – a Stepsize
stop – a StoppingCriterion
x – the (initial or current) value the algorithm is at
optimalX – the optimal value
For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.
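As a rough sketch of working with the returned options (see returnOptions above), the documented fields could be read directly; plain field access assumes the returned options are not wrapped by debug or record decorators.

# Sketch: return the complete Options (see returnOptions above) instead of only xOpt.
options = subGradientMethod(M, F, ∂F, x; returnOptions = true)
options.x         # the point the algorithm stopped at (field x above)
options.optimalX  # the optimal value found (field optimalX above)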