Gradients
For a function $f\colon\mathcal M\to\mathbb R$ the Riemannian gradient $\nabla f(x)$ at $x\in\mathcal M$ is the unique tangent vector fulfilling

$$\langle \nabla f(x), \xi\rangle_x = D_xf[\xi] \quad\text{for all } \xi\in T_x\mathcal M,$$

where $D_xf[\xi]$ denotes the differential of $f$ at $x$ with respect to the tangent direction (vector) $\xi$, in other words the directional derivative.
This page collects the available gradients.
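To make the defining equation concrete, here is a minimal self-contained sketch (plain Julia, not using Manopt itself) on the circle $\mathbb S^1$ represented by angles: for $f(x)=\frac{1}{2}d^2_{\mathcal M}(x,y)$ the gradient is $-\log_x y$, and its inner product with a direction $\xi$ matches the directional derivative. All helper names below are local illustrations, not Manopt API.

```julia
# Plain-Julia sketch on the circle S¹ (angles in [-π, π)).
wrap(θ) = mod(θ + π, 2π) - π              # shift an angle into [-π, π)
dist(a, b) = abs(wrap(b - a))             # geodesic distance on S¹
logmap(a, b) = wrap(b - a)                # log_a b (1-dim tangent coordinate)
expmap(a, ξ) = wrap(a + ξ)                # exp_a(ξ)

y = 0.3
f(x) = dist(x, y)^2 / 2
gradf(x) = -logmap(x, y)                  # known gradient of ½ d²(·, y)

x, ξ, t = 2.0, 0.7, 1e-6
Dfξ = (f(expmap(x, t * ξ)) - f(x)) / t    # finite-difference D_x f[ξ]
println("⟨∇f(x),ξ⟩ = ", gradf(x) * ξ, " ≈ D_xf[ξ] = ", Dfξ)  # both ≈ 1.19
```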
Manopt.forwardLogs — Method

ξ = forwardLogs(M,x)
compute the forward logs $F$ (generalizing forward differences) occurring in the power manifold array, i.e. the function

$$F_i(x) = \sum_{j\in\mathcal I_i} \log_{x_i} x_j, \qquad i\in\mathcal G,$$

where $\mathcal G$ is the set of indices of the Power manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.
Input

- M – a Power manifold
- x – a point on M
Output

- ξ – resulting tangent vector in $T_x\mathcal N$ representing the logs, where $\mathcal N$ is the power manifold with the number of dimensions added to size(x).
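For intuition, a hedged stand-in for the one-dimensional case: a signal of circle-valued data, where the only forward neighbor of $i$ is $i+1$ and the boundary entry is set to zero. This is a plain-Julia illustration, not the Manopt function.

```julia
# Forward logs of a signal of angles; the last entry has no forward
# neighbor and stays zero.
wrap(θ) = mod(θ + π, 2π) - π
forward_logs(x) = [i < length(x) ? wrap(x[i+1] - x[i]) : 0.0 for i in eachindex(x)]

x = [0.1, 0.4, 3.1, -3.0]
println(forward_logs(x))  # ≈ [0.3, 2.7, 0.183, 0.0] — note the wrap across ±π
```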
Manopt.gradDistance — Function

gradDistance(M,y,x[, p=2])
compute the (sub)gradient of the distance (squared)

$$f(x) = \frac{1}{p} d^p_{\mathcal M}(x,y)$$

to a fixed MPoint y on the Manifold M, where p is an integer. The gradient reads

$$\nabla f(x) = -d_{\mathcal M}^{p-2}(x,y)\log_x y$$
for $p\neq 1$ or $x\neq y$. Note that for the remaining case $p=1$, $x=y$ the function is not differentiable. In that case the function returns the zeroTVector(M,x), since this is an element of the subdifferential.
Optional

- p – (2) the exponent of the distance, i.e. the default is the squared distance.
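A small sanity check of the stated formula, again in plain Julia on the circle; the local grad_distance below is an illustration, not the Manopt method itself. It also exercises the $p=1$, $x=y$ subgradient convention.

```julia
# Local illustration of the formula on S¹, including the subgradient
# convention at the nondifferentiable point p = 1, x = y.
wrap(θ) = mod(θ + π, 2π) - π
dist(a, b) = abs(wrap(b - a))
logmap(a, b) = wrap(b - a)

function grad_distance(y, x, p = 2)
    p == 1 && x == y && return 0.0   # an element of the subdifferential
    return -dist(x, y)^(p - 2) * logmap(x, y)
end

println(grad_distance(0.3, 2.0))      # p = 2: -log_x y = 1.7
println(grad_distance(0.3, 2.0, 1))   # p = 1: unit-length (sub)gradient, 1.0
println(grad_distance(0.3, 0.3, 1))   # p = 1, x = y: returns the zero vector
```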
Manopt.gradIntrICTV12 — Method

∇u,∇v = gradIntrICTV12(M,f,u,v,α,β)
compute the (sub)gradient of the intrinsic infimal convolution model using the mid point model of second order differences, see costTV2, i.e. for some $f\in\mathcal M$ on a Power manifold $\mathcal M$ this function computes the (sub)gradient of

$$E(u,v) = \frac{1}{2}\sum_{i\in\mathcal G} d_{\mathcal M}\bigl(g(\tfrac{1}{2};u_i,v_i),f_i\bigr)^2 + \alpha\bigl(\beta\,\mathrm{TV}(u) + (1-\beta)\,\mathrm{TV}_2(v)\bigr),$$

where $g(\frac{1}{2};u_i,v_i)$ denotes the mid point of $u_i$ and $v_i$, and both total variations refer to the intrinsic ones, gradTV and gradTV2, respectively.
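As a sanity check of what is being differentiated, here is the cost in the Euclidean special case $\mathcal M=\mathbb R$, a sketch under my reading of the model: the mid point $g(\frac{1}{2};u_i,v_i)$ becomes the average, $\mathrm{TV}$ uses first differences and $\mathrm{TV}_2$ mid-point second differences. All names are local, not Manopt API.

```julia
# Euclidean (M = ℝ) sketch of the infimal-convolution cost.
tv(u)  = sum(abs, diff(u))                                      # Σ |u_{i+1} - u_i|
tv2(v) = sum(abs((v[i-1] + v[i+1]) / 2 - v[i]) for i in 2:length(v)-1)
E(f, u, v, α, β) = sum(abs2, (u .+ v) ./ 2 .- f) / 2 + α * (β * tv(u) + (1 - β) * tv2(v))

f = [0.0, 0.1, 1.1, 1.0]
println(E(f, f, f, 0.1, 0.5))  # u = v = f: data term vanishes, ≈ 0.1·(0.5·1.2 + 0.5·1.0) = 0.11
```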
Manopt.gradTV — Function

gradTV(M,(x,y),[p=1])

compute the (sub)gradient of $\frac{1}{p}d^p_{\mathcal M}(x,y)$ with respect to both $x$ and $y$.
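Spelled out via the log map (my restatement, consistent with the gradDistance formula above and valid for $p\neq 1$ or $x\neq y$):

$$\partial_x\,\tfrac{1}{p}d^p_{\mathcal M}(x,y) = -d_{\mathcal M}^{p-2}(x,y)\log_x y, \qquad \partial_y\,\tfrac{1}{p}d^p_{\mathcal M}(x,y) = -d_{\mathcal M}^{p-2}(x,y)\log_y x.$$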
Manopt.gradTV — Function

ξ = gradTV(M,λ,x,[p])

Compute the (sub)gradient $\partial F$ of all forward differences occurring in the power manifold array, i.e. of the function

$$F(x) = \sum_{i}\sum_{j\in\mathcal I_i} \frac{1}{p} d^p_{\mathcal M}(x_i,x_j),$$

where $i$ runs over all indices of the Power manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.
Input

- M – a Power manifold
- x – a point on M
Output
- ξ – resulting tangent vector in $T_x\mathcal M$.
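A hedged one-dimensional illustration for $p=1$, in plain Julia on the circle; grad_tv is a local stand-in, not the Manopt method. Each forward difference $d(x_i,x_{i+1})$ contributes $-\log_{x_i}x_{i+1}/d$ to $\xi_i$ and $-\log_{x_{i+1}}x_i/d$ to $\xi_{i+1}$.

```julia
# Local stand-in for the p = 1 case on a signal of angles (S¹ values).
wrap(θ) = mod(θ + π, 2π) - π

function grad_tv(x)
    ξ = zero(x)
    for i in 1:length(x)-1
        d = abs(wrap(x[i+1] - x[i]))
        d == 0 && continue                 # subgradient: choose 0 for this term
        ξ[i]   -= wrap(x[i+1] - x[i]) / d  # -log_{x_i} x_{i+1} / d
        ξ[i+1] -= wrap(x[i] - x[i+1]) / d  # -log_{x_{i+1}} x_i / d
    end
    return ξ
end

println(grad_tv([0.0, 0.5, 0.5, 0.2]))  # [-1.0, 1.0, 1.0, -1.0]
```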
Manopt.gradTV2 — Function

gradTV2(M,(x,y,z),p)
computes the (sub)gradient of $\frac{1}{p}d_2^p(x,y,z)$ with respect to $x$, $y$, and $z$, where $d_2$ denotes the second order absolute difference using the mid point model, i.e. let

$$\mathcal C = \bigl\{ c\in\mathcal M \;\big|\; c = g(\tfrac{1}{2};x,z) \text{ for some geodesic } g \bigr\}$$

denote the mid points between $x$ and $z$ on the manifold $\mathcal M$. Then the absolute second order difference is defined as

$$d_2(x,y,z) = \min_{c\in\mathcal C} d_{\mathcal M}(c,y).$$
While the (sub)gradient with respect to $y$ is easy, the other two require the evaluation of an adjointJacobiField. See Illustration of the Gradient of a Second Order Difference for its derivation.
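For orientation, consider the Euclidean special case $\mathcal M=\mathbb R^n$ (my own worked example, not from the docstring): there the mid point set is the single point $c=\frac{x+z}{2}$, the adjoint Jacobi field reduces to the factor $\frac{1}{2}$, and for $p=1$, $y\neq c$ one obtains

$$\partial_x d_2 = \frac{c-y}{2\lVert c-y\rVert},\qquad \partial_y d_2 = \frac{y-c}{\lVert y-c\rVert},\qquad \partial_z d_2 = \frac{c-y}{2\lVert c-y\rVert}.$$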