Subgradient Method
Manopt.subgradient_method — Function
subgradient_method(M, F, ∂F, x)
perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, s_k∂F(x_k))$,
where $\mathrm{retr}$ is a retraction and $s_k$ is a step size, which can be specified as a function but is usually set to a constant value. Though the subgradient might be set-valued, the argument ∂F should always return one element from the subgradient, though not necessarily deterministically.
Input
- M – a manifold $\mathcal M$
- F – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
- ∂F – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of F, restricted to always returning one value/element from the subgradient
- x – an initial value $x ∈ \mathcal M$
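To illustrate the iteration itself, here is a minimal, self-contained sketch on a Euclidean manifold, where the retraction is plain vector addition and the step is taken along $-∂F$ to descend. The cost, subgradient, and helper name below are illustrative assumptions, not Manopt's implementation:

```julia
# A minimal sketch of the subgradient iteration on Euclidean space,
# where retr(x, v) = x + v (hypothetical helper; not Manopt internals).
F(x) = sum(abs, x)        # nonsmooth cost F(x) = ‖x‖₁
∂F(x) = sign.(x)          # one element of the subgradient at x

function subgradient_sketch(F, ∂F, x; s=0.1, iterations=1000)
    x_opt, cost_opt = x, F(x)     # subgradient steps need not descend,
    for _ in 1:iterations         # so keep the best iterate seen so far
        x = x .- s .* ∂F(x)       # step along -∂F with constant size s
        if F(x) < cost_opt
            x_opt, cost_opt = x, F(x)
        end
    end
    return x_opt
end

x_opt = subgradient_sketch(F, ∂F, [2.0, -3.0])
```

With a constant step size the iterates only enter an $s$-neighbourhood of the minimizer, which is why the best-so-far iterate is tracked rather than the last one.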
Optional
- stepsize – (ConstantStepsize(1.)) specify a Stepsize
- retraction – (exp) a retraction(M,x,ξ) to use
- stopping_criterion – (StopAfterIteration(5000)) a functor, see StoppingCriterion, indicating when to stop
- return_options – (false) if activated, the extended result, i.e. the complete Options, are returned. This can be used to access recorded values. If set to false (default) just the optimal value x_opt is returned

... and the ones that are passed to decorate_options for decorators.
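Since the step size may also depend on the iteration number, the classical diminishing rule $s_k = 1/k$ (which satisfies $s_k \to 0$ and $\sum_k s_k = \infty$) can be sketched in the same plain-Euclidean setting. This is only an illustration of the rule; in Manopt the step size is encapsulated in a Stepsize object, and the names below are assumptions:

```julia
# Diminishing step size s_k = 1/k in a plain-Euclidean subgradient loop
# (illustrative sketch; not Manopt's Stepsize machinery).
F(x) = sum(abs, x)        # nonsmooth cost ‖x‖₁
∂F(x) = sign.(x)          # one element of the subgradient at x

function diminishing_sketch(F, ∂F, x; iterations=5000)
    cost_opt = F(x)                  # best cost seen so far
    for k in 1:iterations
        x = x .- (1 / k) .* ∂F(x)    # s_k = 1/k: s_k → 0, ∑ s_k = ∞
        cost_opt = min(cost_opt, F(x))
    end
    return cost_opt
end

cost_opt = diminishing_sketch(F, ∂F, [2.0, -3.0])
```

Unlike a constant step size, a diminishing rule drives the best observed cost toward the optimal value.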
Output
- x_opt – the resulting (approximately critical) point of the subgradient method

OR

- options – the options returned by the solver (see return_options)
Manopt.subgradient_method! — Function
subgradient_method!(M, F, ∂F, x)
perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, s_k∂F(x_k))$ in place of x.
Input
- M – a manifold $\mathcal M$
- F – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
- ∂F – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of F, restricted to always returning one value/element from the subgradient
- x – an initial value $x ∈ \mathcal M$
For more details and all optional parameters, see subgradient_method.
Options
Manopt.SubGradientMethodOptions — Type
SubGradientMethodOptions <: Options
stores option values for a subgradient_method solver.
Fields
- retraction_method – the retraction to use
- stepsize – a Stepsize
- stop – a StoppingCriterion
- x – the (initial or current) value the algorithm is at
- x_optimal – the optimal value
- ∂ – the current element from the possible subgradients at x that is used
For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the steepest descent actions.