Gradient Descent
Manopt.stochastic_gradient_descent — Function

    stochastic_gradient_descent(M, ∇F, x)

Perform a stochastic gradient descent.
Input
- `M` – a manifold $\mathcal M$
- `∇F` – a gradient function, that either returns a vector of the gradients, or is a vector of gradient functions
- `x` – an initial value $x ∈ \mathcal M$
Optional
- `cost` – (`missing`) you can provide a cost function, for example to track the function value
- `evaluation_order` – (`:Random`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle permuted sequence (`:Linear`), or the default `:Random` one
- `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
- `stepsize` – (`ConstantStepsize(1.0)`) a `Stepsize`
- `order_type` – (`:RandomOrder`) a type of ordering of gradient evaluations; possible values are `:RandomOrder`, `:FixedPermutation`, and `:LinearOrder`
- `order` – (`[1:n]`) the initial permutation, where `n` is the number of gradients in `∇F`
- `retraction_method` – (`ExponentialRetraction()`) a `retraction(M,x,ξ)` to use
Output
- `x_opt` – the resulting (approximately critical) point of the stochastic gradient descent

OR

- `options` – the options returned by the solver (see `return_options`)
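As a usage sketch, one could compute a Riemannian mean by stochastic gradient descent. The manifold, data points, and cost below are illustrative assumptions (using `Sphere` and `log` from Manifolds.jl), not part of this page:

```julia
using Manopt, Manifolds

# Illustrative setup (assumption): the 2-sphere and three data points on it.
M = Sphere(2)
data = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# ∇F given as a vector of gradient functions, one per summand
# F_i(x) = d(x, p_i)^2 / 2, whose Riemannian gradient is -log(M, x, p_i).
∇F = [x -> -log(M, x, p) for p in data]

x0 = [1.0, 0.0, 0.0]
x_opt = stochastic_gradient_descent(M, ∇F, x0)
```

Passing a `cost` keyword, e.g. `cost = x -> sum(distance(M, x, p)^2 for p in data) / 2`, would additionally allow tracking the function value.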
Manopt.stochastic_gradient_descent! — Function

    stochastic_gradient_descent!(M, ∇F, x)

Perform a stochastic gradient descent in place of `x`.
Input
- `M` – a manifold $\mathcal M$
- `∇F` – a gradient function, that either returns a vector of the gradients, or is a vector of gradient functions
- `x` – an initial value $x ∈ \mathcal M$

For all optional parameters, see stochastic_gradient_descent.
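A minimal sketch of the in-place variant, using the same vector-of-gradient-functions calling form; the manifold and data points are illustrative assumptions (via Manifolds.jl):

```julia
using Manopt, Manifolds

M = Sphere(2)
data = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
∇F = [x -> -log(M, x, p) for p in data]

# x is overwritten with the resulting (approximately critical) point.
x = [1.0, 0.0, 0.0]
stochastic_gradient_descent!(M, ∇F, x)
```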
Options
Manopt.StochasticGradientDescentOptions — Type

    StochasticGradientDescentOptions <: AbstractStochasticGradientDescentOptions

Store the following fields for a default stochastic gradient descent algorithm, see also StochasticGradientProblem and stochastic_gradient_descent.
Fields
- `x` – the current iterate
- `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
- `stepsize` – (`ConstantStepsize(1.0)`) a `Stepsize`
- `evaluation_order` – (`:Random`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle permuted sequence (`:Linear`), or the default `:Random` one
- `order` – the current permutation
- `retraction_method` – (`ExponentialRetraction()`) a `retraction(M,x,ξ)` to use
Constructor

    StochasticGradientDescentOptions(x)

Create a StochasticGradientDescentOptions with start point `x`. All other fields are optional keyword arguments.

Additionally, the options share a DirectionUpdateRule, so you can also apply MomentumGradient and AverageGradient here. The innermost one should always be a StochasticGradient.
Manopt.AbstractStochasticGradientProcessor — Type

    AbstractStochasticGradientDescentOptions <: Options

A generic type for all options related to stochastic gradient descent methods.
Manopt.StochasticGradient — Type

    StochasticGradient <: DirectionUpdateRule

The default gradient processor, which just evaluates the (stochastic) gradient or a subset thereof.