perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k∂F(x_k))$,

where $\mathrm{retr}$ is a retraction and $s_k$ is a step size, which can be specified as a function but is usually set to a constant value. Though the subdifferential might be set-valued, the argument ∂F should always return a single element from the subdifferential, though not necessarily deterministically, i.e. it may return a different element on each call.
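The update rule can be sketched in a few lines. The following is an illustrative Python sketch (not the package's Julia implementation), assuming the unit sphere with the projection retraction:

```python
import numpy as np

def retr(x, v):
    # Retraction on the unit sphere: step in the embedding space,
    # then renormalize back onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def subgradient_step(x, dF, s):
    # One iteration x_{k+1} = retr(x_k, -s_k * dF(x_k)), where dF
    # returns a single (possibly arbitrary) element of the
    # subdifferential at x.
    return retr(x, -s * dF(x))
```

For example, with the cost F(x) = x[0] and dF returning its Euclidean gradient, one step from x = (0, 1) decreases the first coordinate while staying on the sphere.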

Input

• M – a manifold $\mathcal M$
• F – a cost function $F:\mathcal M→ℝ$ to minimize
• ∂F – the (sub)gradient $\partial F: \mathcal M→ T\mathcal M$ of F, restricted to always returning only one value/element from the subdifferential. This function can be passed as an allocation function (M, y) -> X or a mutating function (M, X, y) -> X, see evaluation.
• x – an initial value $x ∈ \mathcal M$

Optional

• evaluation – (AllocatingEvaluation) specify whether the subgradient works by allocation (default), i.e. is of the form ∂F(M, y), or as a MutatingEvaluation working in place, i.e. is of the form ∂F!(M, X, x).
• stepsize – (ConstantStepsize(1.)) specify a Stepsize
• retraction – (default_retraction_method(M)) a retraction(M,x,ξ) to use.
• stopping_criterion – (StopAfterIteration(5000)) a functor, see StoppingCriterion, indicating when to stop.
• return_options – (false) – if activated, the extended result, i.e. the complete Options, is returned. This can be used to access recorded values. If set to false (default), only the optimal value x_opt is returned.

... and the ones that are passed to decorate_options for decorators.

Output

• x_opt – the resulting (approximately critical) point of the subgradient method

OR

• options - the options returned by the solver (see return_options)
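As a hedged illustration of the full loop described above, here is a standalone Python sketch (hypothetical names, not the Manopt.jl API) that supports a constant or callable stepsize, stops after a fixed iteration count, and returns the best iterate as x_opt:

```python
import numpy as np

def subgradient_method_sketch(F, dF, x, retr, stepsize=0.1, max_iter=5000):
    # Iterate x_{k+1} = retr(x_k, -s_k * dF(x_k)).  Since subgradient
    # steps need not decrease F monotonically, keep track of the best
    # iterate seen so far and return it as x_opt.
    x_opt, F_opt = x, F(x)
    for k in range(max_iter):
        s = stepsize(k) if callable(stepsize) else stepsize
        x = retr(x, -s * dF(x))
        if F(x) < F_opt:
            x_opt, F_opt = x, F(x)
    return x_opt
```

A decreasing step size such as `stepsize=lambda k: 1/(k+1)` can be passed in place of the constant.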

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k∂F(x_k))$ in place of x.

Input

• M – a manifold $\mathcal M$
• F – a cost function $F:\mathcal M→ℝ$ to minimize
• ∂F – the (sub)gradient $\partial F:\mathcal M→ T\mathcal M$ of F, restricted to always returning only one value/element from the subdifferential. This function can be passed as an allocation function (M, y) -> X or a mutating function (M, X, y) -> X, see evaluation.
• x – an initial value $x ∈ \mathcal M$

for more details and all optional parameters, see subgradient_method.


Options

For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.
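To illustrate what such record actions collect, here is a hypothetical Python sketch (not the package API) that stores the cost, the subgradient norm, and the step size at every iteration:

```python
import numpy as np

def subgradient_method_recorded(F, dF, x, retr, stepsize=0.1, max_iter=50):
    # Per-iteration record of (cost, subgradient norm, step size),
    # mimicking the values that Debug/Record actions would report.
    record = []
    for k in range(max_iter):
        g = dF(x)
        s = stepsize(k) if callable(stepsize) else stepsize
        record.append((F(x), float(np.linalg.norm(g)), s))
        x = retr(x, -s * g)
    return x, record
```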