Subgradient Method

Manopt.subgradient_method – Function
subgradient_method(M, F, ∂F, x)

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k∂F(x_k))$,

where $\mathrm{retr}$ is a retraction and $s_k$ is a step size, which can be specified as a function but is usually set to a constant value. Though the subgradient might be set-valued, the argument ∂F should always return a single element from the subgradient, though not necessarily deterministically.
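In the Euclidean special case $\mathcal M = \mathbb R$ with $\mathrm{retr}(x, ξ) = x + ξ$, the iteration can be sketched as follows. This is a plain NumPy illustration, not Manopt code; the function names and defaults here are hypothetical:

```python
import numpy as np

def subgradient_sketch(subgrad, x0, stepsize=0.1, maxiter=100,
                       retr=lambda x, xi: x + xi):
    # x_{k+1} = retr(x_k, -s_k * ∂F(x_k)); in the Euclidean case retr is addition
    x = x0
    for k in range(maxiter):
        x = retr(x, -stepsize * subgrad(x))
    return x

# nonsmooth cost f(x) = |x - 3| + |x + 1|, minimized on the interval [-1, 3]
f = lambda x: abs(x - 3) + abs(x + 1)
# one element of the subdifferential (np.sign picks 0 at the kinks)
subgrad = lambda x: np.sign(x - 3) + np.sign(x + 1)

x_opt = subgradient_sketch(subgrad, 10.0)
```

With the constant stepsize the iterate walks into the minimizing interval and then stays there, since the chosen subgradient element vanishes on its interior.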

Input

  • M – a manifold $\mathcal M$
  • F – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
  • ∂F – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of F, restricted to always returning only one value/element from the subgradient
  • x – an initial value $x ∈ \mathcal M$

Optional

  • stepsize – (ConstantStepsize(1.)) specify a Stepsize
  • retraction – (exp) a retraction(M,x,ξ) to use.
  • stopping_criterion – (StopAfterIteration(5000)) a functor, see StoppingCriterion, indicating when to stop.
  • return_options – (false) if activated, the extended result, i.e. the complete Options, is returned. This can be used to access recorded values. If set to false (default), just the optimal value x_opt is returned.

... and the ones that are passed to decorate_options for decorators.
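Concerning the stepsize option: with a nonsmooth cost, a constant stepsize in general only reaches a neighborhood of a minimizer, whereas a diminishing stepsize such as $s_k = 1/(k+1)$ converges in the convex Euclidean case. A minimal NumPy sketch of the diminishing variant (not Manopt code; in Manopt the stepsize would instead be passed as a Stepsize object or function):

```python
import numpy as np

# minimize f(x) = |x - 2| with x_{k+1} = x_k - s_k * sign(x_k - 2)
# using the diminishing stepsize s_k = 1/(k+1)
x = 5.0
for k in range(2000):
    x = x - (1.0 / (k + 1)) * np.sign(x - 2.0)
```

Because the stepsizes sum to infinity the iterate reaches the minimizer, and because they tend to zero the oscillation around it dies out.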

Output

  • x_opt – the resulting (approximately critical) point of the subgradient method

OR

  • options - the options returned by the solver (see return_options)

Options

For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.