Subgradient method
Manopt.subgradient_method — Function

subgradient_method(M, f, ∂f, p=rand(M); kwargs...)
subgradient_method(M, sgo, p=rand(M); kwargs...)
subgradient_method!(M, f, ∂f, p; kwargs...)
subgradient_method!(M, sgo, p; kwargs...)
perform a subgradient method $p^{(k+1)} = \operatorname{retr}\bigl(p^{(k)}, -s^{(k)}∂f(p^{(k)})\bigr)$, where $\operatorname{retr}$ is a retraction and $s^{(k)}$ is a step size.
Though the subdifferential might be set-valued, the argument ∂f should always return a single element from it; this choice need not be deterministic. For more details see [FO98].
Input

- M::AbstractManifold: a Riemannian manifold $\mathcal M$
- f: a cost function $f: \mathcal M → ℝ$ implemented as (M, p) -> v
- ∂f: the (sub)gradient $∂ f: \mathcal M → T\mathcal M$ of f
- p: a point on the manifold $\mathcal M$

Alternatively to f and ∂f, a ManifoldSubgradientObjective sgo can be provided.
Keyword arguments

- evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating their result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
- retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
- stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
- stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
- X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector

All other keyword arguments are passed to decorate_state! for decorators.
Output
The obtained (approximate) minimizer $p^*$, see get_solver_return for details.
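As a brief illustration (a sketch, not taken from the package documentation), the solver can be used to compute a Riemannian median on the sphere. The manifold, distance, and logarithmic map are assumed to come from Manifolds.jl; the data points and the names data, p0 are made up for the example.

```julia
using Manopt, Manifolds, Random
Random.seed!(42)

M = Sphere(2)                        # unit sphere S² in ℝ³
data = [rand(M) for _ in 1:20]       # hypothetical sample points

# Riemannian median: a mean of distances, hence non-smooth where p hits a data point
f(M, p) = sum(distance(M, p, q) for q in data) / length(data)

# return one element of the subdifferential: for p ≠ q the distance is smooth
# with gradient -log_p(q)/d(p, q); at p == q we may pick 0 from the subdifferential
function ∂f(M, p)
    X = zero_vector(M, p)
    for q in data
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ d)
    end
    return X ./ length(data)
end

p0 = data[1]
p_star = subgradient_method(M, f, ∂f, p0; stopping_criterion=StopAfterIteration(2000))
f(M, p_star)                         # cost at the returned candidate minimizer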
Manopt.subgradient_method! — Function

subgradient_method(M, f, ∂f, p=rand(M); kwargs...)
subgradient_method(M, sgo, p=rand(M); kwargs...)
subgradient_method!(M, f, ∂f, p; kwargs...)
subgradient_method!(M, sgo, p; kwargs...)
perform a subgradient method $p^{(k+1)} = \operatorname{retr}\bigl(p^{(k)}, -s^{(k)}∂f(p^{(k)})\bigr)$, where $\operatorname{retr}$ is a retraction and $s^{(k)}$ is a step size.
Though the subdifferential might be set-valued, the argument ∂f should always return a single element from it; this choice need not be deterministic. For more details see [FO98].
Input

- M::AbstractManifold: a Riemannian manifold $\mathcal M$
- f: a cost function $f: \mathcal M → ℝ$ implemented as (M, p) -> v
- ∂f: the (sub)gradient $∂ f: \mathcal M → T\mathcal M$ of f
- p: a point on the manifold $\mathcal M$

Alternatively to f and ∂f, a ManifoldSubgradientObjective sgo can be provided.
Keyword arguments

- evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating their result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
- retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
- stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
- stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
- X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector

All other keyword arguments are passed to decorate_state! for decorators.
Output
The obtained (approximate) minimizer $p^*$, see get_solver_return for details.
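For the in-place variant the (sub)gradient uses the (M, X, p) signature with evaluation=InplaceEvaluation(), and the solver modifies the start point directly. A minimal sketch under the same hypothetical median setup as above:

```julia
using Manopt, Manifolds

M = Sphere(2)
data = [rand(M) for _ in 1:20]        # hypothetical sample points
f(M, p) = sum(distance(M, p, q) for q in data) / length(data)

# in-place subgradient: write one subdifferential element into X
function ∂f!(M, X, p)
    zero_vector!(M, X, p)
    for q in data
        d = distance(M, p, q)
        d > 0 && (X .-= log(M, p, q) ./ d)
    end
    X ./= length(data)
    return X
end

p = copy(M, data[1])                  # this point is modified in place
subgradient_method!(M, f, ∂f!, p; evaluation=InplaceEvaluation())
```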
State
Manopt.SubGradientMethodState — Type

SubGradientMethodState <: AbstractManoptSolverState

stores option values for a subgradient_method solver.
Fields

- p::P: a point on the manifold $\mathcal M$ storing the current iterate
- p_star: the best point found so far, the current candidate for the minimizer
- retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
- stepsize::Stepsize: a functor inheriting from Stepsize to determine a step size
- stop::StoppingCriterion: a functor indicating that the stopping criterion is fulfilled
- X: the current element from the possible subgradients at p that was last evaluated.
Constructor

SubGradientMethodState(M::AbstractManifold; kwargs...)

Initialise the subgradient method state.

Keyword arguments

- retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
- p=rand(M): a point on the manifold $\mathcal M$ to specify the initial value
- stepsize=default_stepsize(M, SubGradientMethodState): a functor inheriting from Stepsize to determine a step size
- stopping_criterion=StopAfterIteration(5000): a functor indicating that the stopping criterion is fulfilled
- X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$ to specify the representation of a tangent vector
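For illustration, a sketch of constructing the state directly and running the solver loop manually; ManifoldSubgradientObjective, DefaultManoptProblem, solve!, and get_solver_result are part of Manopt, while the cost, subgradient, and points on the sphere are made-up assumptions.

```julia
using Manopt, Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]                                   # hypothetical target point
f(M, p) = distance(M, p, q)                           # non-smooth at p == q
∂f(M, p) = p ≈ q ? zero_vector(M, p) : -log(M, p, q) / distance(M, p, q)

obj = ManifoldSubgradientObjective(f, ∂f)
mp = DefaultManoptProblem(M, obj)
st = SubGradientMethodState(M;
    p=[1.0, 0.0, 0.0],
    stopping_criterion=StopAfterIteration(500),
)
solve!(mp, st)
get_solver_result(st)                                  # the candidate minimizer p*
```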
For DebugActions and RecordActions to record the (sub)gradient, its norm, and the step sizes, see the gradient descent actions.
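For example (a sketch; the cost, subgradient, and points are assumptions), the usual debug= and record= keywords can print the cost during the iterations and record it for later inspection:

```julia
using Manopt, Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)
∂f(M, p) = p ≈ q ? zero_vector(M, p) : -log(M, p, q) / distance(M, p, q)

st = subgradient_method(M, f, ∂f, [1.0, 0.0, 0.0];
    debug=[:Iteration, " | ", :Cost, "\n", 100],   # print every 100th iteration
    record=[:Cost],                                # keep the cost values
    return_state=true,                             # return the state, not just p*
)
p_star = get_solver_result(st)
costs = get_record(st)
```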
Technical details
The subgradient_method solver requires the following functions of a manifold to be available:

- A retract!(M, q, p, X); it is recommended to set the default_retraction_method to a favourite retraction. If this default is set, a retraction_method= does not have to be specified.
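For instance (a sketch; ProjectionRetraction comes from ManifoldsBase.jl/Manifolds.jl, the cost and points are assumptions), a retraction can also be passed explicitly instead of relying on the manifold's default:

```julia
using Manopt, Manifolds

M = Sphere(2)
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)
∂f(M, p) = p ≈ q ? zero_vector(M, p) : -log(M, p, q) / distance(M, p, q)

# use the projection-based retraction instead of the manifold's default
p_star = subgradient_method(M, f, ∂f, [1.0, 0.0, 0.0];
    retraction_method=ProjectionRetraction(),
)
```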
Literature
- [FO98]
- O. Ferreira and P. R. Oliveira. Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97, 93–104 (1998).