Stochastic Gradient Descent
`Manopt.stochastic_gradient_descent` — Function

```
stochastic_gradient_descent(M, ∇F, x)
```

perform a stochastic gradient descent
Input
* `M` – a manifold $\mathcal M$
* `∇F` – a gradient function, that either returns a vector of the subgradients or is a vector of gradients
* `x` – an initial value $x ∈ \mathcal M$
Optional
* `cost` – (`missing`) you can provide a cost function, for example to track the function value
* `evaluation_order` – (`:Random`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle permuted sequence (`:Linear`), or the default `:Random` one
* `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
* `stepsize` – (`ConstantStepsize(1.0)`) a `Stepsize`
* `order_type` – (`:RandomOrder`) the type of ordering of the gradient evaluations. Possible values are `:RandomOrder`, `:FixedPermutation`, or `:LinearOrder`
* `order` – (`[1:n]`) the initial permutation, where `n` is the number of gradients in `∇F`
* `retraction_method` – (`ExponentialRetraction()`) a `retraction(M, x, ξ)` to use
Output
* `x_opt` – the resulting (approximately critical) point of the stochastic gradient descent
OR
* `options` – the options returned by the solver (see `return_options`)
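As an illustration of what the solver computes, here is a minimal self-contained sketch of stochastic gradient descent on the unit sphere, using the random evaluation order. All names here (`project`, `retract`, `sgd_sphere`) are hypothetical stand-ins, not Manopt's API; a projection-based retraction replaces the default `ExponentialRetraction()` for brevity.

```julia
using LinearAlgebra, Random

# Hypothetical sketch: stochastic gradient descent on the unit sphere S².
# Each iteration evaluates one randomly chosen gradient component, takes a
# step in the negative Riemannian gradient direction, and retracts back
# onto the manifold.
project(x, v) = v - dot(x, v) * x          # tangent-space projection at x
retract(x, v) = (y = x + v; y / norm(y))   # a simple projection retraction

function sgd_sphere(grads, x; stepsize = 0.1, iterations = 1000, rng = Random.default_rng())
    for _ in 1:iterations
        ∇f = grads[rand(rng, eachindex(grads))]  # random evaluation order
        x = retract(x, -stepsize * project(x, ∇f(x)))
    end
    return x
end

# Example cost: a Riemannian center of mass of three nearby points; each
# summand contributes the Euclidean gradient of ½‖x - pᵢ‖², i.e. x - pᵢ.
pts = [normalize([1.0, 0.1, 0.0]), normalize([1.0, -0.1, 0.0]), normalize([1.0, 0.0, 0.1])]
grads = [x -> x - p for p in pts]
x_opt = sgd_sphere(grads, normalize([0.0, 1.0, 1.0]); rng = MersenneTwister(42))
```

With a constant stepsize the iterate does not settle exactly but fluctuates in a neighborhood of the minimizer, which is the usual behavior of stochastic gradient descent.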
`Manopt.stochastic_gradient_descent!` — Function

```
stochastic_gradient_descent!(M, ∇F, x)
```

perform a stochastic gradient descent in place of `x`.
Input
* `M` – a manifold $\mathcal M$
* `∇F` – a gradient function, that either returns a vector of the subgradients or is a vector of gradients
* `x` – an initial value $x ∈ \mathcal M$
For all optional parameters, see `stochastic_gradient_descent`.
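The in-place variant follows the usual Julia `!` convention: the start point is overwritten with the result rather than a new point being allocated. A minimal sketch of that convention on the unit sphere (hypothetical names, not Manopt's internals):

```julia
using LinearAlgebra, Random

# Hypothetical sketch of the in-place convention: `x` is mutated with
# broadcast assignments instead of being rebound to a new array.
function sgd_sphere!(grads, x; stepsize = 0.1, iterations = 1000, rng = Random.default_rng())
    for _ in 1:iterations
        v = grads[rand(rng, eachindex(grads))](x)  # one stochastic gradient component
        v .-= dot(x, v) .* x                       # project onto the tangent space at x
        x .-= stepsize .* v                        # gradient step
        x ./= norm(x)                              # retract back onto the sphere
    end
    return x
end

p = [1.0, 0.0, 0.0]
x = normalize([0.0, 1.0, 1.0])
sgd_sphere!([y -> y - p], x)   # x itself now lies close to the minimizer p
```

Mutating the iterate avoids one allocation per call, which matters when the solver is invoked repeatedly, for example inside a cross-validation loop.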
Options
`Manopt.StochasticGradientDescentOptions` — Type

```
StochasticGradientDescentOptions <: AbstractStochasticGradientDescentOptions
```

Store the following fields for a default stochastic gradient descent algorithm, see also `StochasticGradientProblem` and `stochastic_gradient_descent`.
Fields
* `x` – the current iterate
* `stopping_criterion` – (`StopAfterIteration(1000)`) a `StoppingCriterion`
* `stepsize` – (`ConstantStepsize(1.0)`) a `Stepsize`
* `evaluation_order` – (`:Random`) whether to use a randomly permuted sequence (`:FixedRandom`), a per-cycle permuted sequence (`:Linear`), or the default `:Random` one
* `order` – the current permutation
* `retraction_method` – (`ExponentialRetraction()`) a `retraction(M, x, ξ)` to use
Constructor
```
StochasticGradientDescentOptions(x)
```

Create a `StochasticGradientDescentOptions` with start point `x`. All other fields are optional keyword arguments.
Additionally, the options share a `DirectionUpdateRule`, so you can also apply `MomentumGradient` and `AverageGradient` here. The innermost rule should always be a `StochasticGradient`.
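The nesting pattern can be sketched in plain Julia. All type names below are hypothetical stand-ins, not Manopt's actual types: an outer momentum rule delegates to an inner rule, and the innermost rule is the one that evaluates a stochastic gradient component.

```julia
using Random

abstract type Rule end  # stand-in for a direction update rule

# Innermost rule: evaluate one randomly chosen gradient component.
struct StochasticGradientRule <: Rule
    grads::Vector{Function}
end
(r::StochasticGradientRule)(x) = r.grads[rand(eachindex(r.grads))](x)

# Outer rule: blend the inner rule's direction with the previous direction.
mutable struct MomentumRule{R<:Rule} <: Rule
    inner::R              # the wrapped rule, ultimately a StochasticGradientRule
    momentum::Float64
    last::Vector{Float64}
end
function (r::MomentumRule)(x)
    r.last = r.momentum .* r.last .+ (1 - r.momentum) .* r.inner(x)
    return r.last
end

grads = [x -> x .- 1.0, x -> x .+ 1.0]
rule = MomentumRule(StochasticGradientRule(grads), 0.9, zeros(2))
d = rule([0.0, 0.0])   # a momentum-smoothed stochastic direction
```

Because each outer rule only calls its wrapped `inner` rule, the rules compose freely, but the chain must bottom out in a rule that actually produces a gradient, which is why the stochastic gradient processor sits innermost.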
`Manopt.AbstractStochasticGradientProcessor` — Type

```
AbstractStochasticGradientDescentOptions <: Options
```

A generic type for all options related to stochastic gradient descent methods.
`Manopt.StochasticGradient` — Type

```
StochasticGradient <: DirectionUpdateRule
```

The default gradient processor, which just evaluates the (stochastic) gradient or a subset thereof.