Gradients

For a function $f\colon\mathcal M\to\mathbb R$ the Riemannian gradient $\nabla f(x)$ at $x∈\mathcal M$ is given by the unique tangent vector fulfilling

\[\langle \nabla f(x), \xi\rangle_x = D_xf[\xi],\quad \forall \xi ∈ T_x\mathcal M,\]

where $D_xf[\xi]$ denotes the differential of $f$ at $x$ in the tangent direction $\xi$, i.e. the directional derivative of $f$ along $\xi$.
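For example, on the sphere $\mathbb S^{n-1}\subset\mathbb R^n$ with the metric inherited from the embedding and the linear function $f(x) = \langle a, x\rangle$ for some fixed $a ∈ \mathbb R^n$, the differential is $D_xf[\xi] = \langle a, \xi\rangle$, so the Riemannian gradient is the projection of $a$ onto the tangent space,

\[\nabla f(x) = a - \langle a, x\rangle x,\]

since $\langle a - \langle a,x\rangle x, \xi\rangle = \langle a,\xi\rangle$ for all $\xi ∈ T_x\mathbb S^{n-1}$, using $\langle x, \xi\rangle = 0$.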

This page collects the available gradients.

Manopt.forward_logsMethod
ξ = forward_logs(M,x)

compute the forward logs $F$ (generalizing forward differences) occurring in the power manifold array, i.e. the function

\[F_i(x) = \sum_{j ∈ \mathcal I_i} \log_{x_i} x_j,\quad i ∈ \mathcal G,\]

where $\mathcal G$ is the set of indices of the PowerManifold manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.

Input

  • M – a PowerManifold manifold
  • x – a point.

Output

  • ξ – resulting tangent vector in $T_x\mathcal N$ representing the logs, where $\mathcal N$ is the power manifold with the number of dimensions added to size(x).
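A minimal usage sketch, assuming Manifolds.jl provides the Circle and PowerManifold used here; the power manifold constructor and point layout are assumptions:

using Manopt, Manifolds

# a signal of three angles, viewed as a point on a power manifold of the circle
M = PowerManifold(Circle(), 3)
x = [0.0, 0.1, 0.2]

# at every index i, sum of the logarithmic maps towards the forward neighbors
ξ = forward_logs(M, x)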
source
Manopt.∇L2_acceleration_bezierMethod
∇L2_acceleration_bezier(
    M::Manifold,
    B::AbstractVector{P},
    degrees::AbstractVector{<:Integer},
    T::AbstractVector{<:AbstractFloat},
    λ::Float64,
    d::AbstractVector{P}
) where {P}

compute the gradient of the discretized acceleration of a composite Bézier curve on the Manifold M with respect to its control points B together with a data term that relates the junction points p_i to the data d with a weight $\lambda$ compared to the acceleration. The curve is evaluated at the points given in T (elementwise in $[0,N]$), where $N$ is the number of segments of the Bézier curve. The summands are ∇distance for the data term and ∇acceleration_bezier for the acceleration with interpolation constraints. Here the get_bezier_junctions are included in the optimization, i.e. setting $λ=0$ yields the unconstrained acceleration minimization. Note that this is ill-posed, since any Bézier curve identical to a geodesic is a minimizer.

Note that the Bézier curve is given in reduced form as a point on a PowerManifold; together with the degrees of the segments and assuming a differentiable curve, the segments can internally be reconstructed.

See also

∇acceleration_bezier, cost_L2_acceleration_bezier, cost_acceleration_bezier.

source
Manopt.∇TVFunction
ξ = ∇TV(M,x,[p])

Compute the (sub)gradient $\partial F$ of all forward differences occurring in the power manifold array, i.e. of the function

\[F(x) = \sum_{i}\sum_{j ∈ \mathcal I_i} d^p(x_i,x_j)\]

where $i$ runs over all indices of the PowerManifold manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.

Input

  • M – a PowerManifold manifold
  • x – a point.

Output

  • ξ – resulting tangent vector in $T_x\mathcal M$.
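A minimal sketch for a short signal of angles, assuming Manifolds.jl's Circle and PowerManifold and that the call takes exactly the inputs listed above:

using Manopt, Manifolds

M = PowerManifold(Circle(), 5)
x = [0.0, 0.2, 0.1, 0.4, 0.3]

# (sub)gradient of the sum of forward differences d(x_i, x_j), here with p = 1
ξ = ∇TV(M, x, 1)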
source
Manopt.∇TVMethod
∇TV(M,(x,y),[p=1])

compute the (sub) gradient of $\frac{1}{p}d^p_{\mathcal M}(x,y)$ with respect to both $x$ and $y$.
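A sketch for a single pair of points, assuming Manifolds.jl's Sphere:

using Manopt, Manifolds

M = Sphere(2)
x = [1.0, 0.0, 0.0]
y = [0.0, 1.0, 0.0]

# (sub)gradient of d(x,y) (p = 1) with respect to both arguments
ξ = ∇TV(M, (x, y))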

source
Manopt.∇TV2Function
∇TV2(M,q [,p=1])

computes the (sub) gradient of $\frac{1}{p}d_2^p(x_1,x_2,x_3)$ with respect to all $x_1,x_2,x_3$ occurring along any array dimension in the point q, where M is the corresponding PowerManifold.
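A sketch for a signal of angles, analogous to ∇TV above; the PowerManifold construction is an assumption:

using Manopt, Manifolds

M = PowerManifold(Circle(), 5)
q = [0.0, 0.1, 0.3, 0.35, 0.7]

# (sub)gradient of all second order differences along the signal, p = 1
ξ = ∇TV2(M, q)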

source
Manopt.∇TV2Method
∇TV2(M,(x,y,z),p)

computes the (sub) gradient of $\frac{1}{p}d_2^p(x,y,z)$ with respect to $x$, $y$, and $z$, where $d_2$ denotes the second order absolute difference using the mid point model, i.e. let

\[\mathcal C_{x,z} = \bigl\{ c ∈ \mathcal M \ |\ c = g\bigl(\tfrac{1}{2};x,z\bigr) \text{ for some geodesic }g\bigr\}\]

denote the mid points between $x$ and $z$ on the manifold $\mathcal M$. Then the absolute second order difference is defined as

\[d_2(x,y,z) = \min_{c ∈ \mathcal C_{x,z}} d(c,y).\]

While the (sub)gradient with respect to $y$ is easy, the other two require the evaluation of an adjoint_Jacobi_field. See Illustration of the Gradient of a Second Order Difference for its derivation.
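A sketch for a single triple of points, assuming Manifolds.jl's Sphere:

using Manopt, Manifolds

M = Sphere(2)
x = [1.0, 0.0, 0.0]
y = [0.0, 1.0, 0.0]
z = [0.0, 0.0, 1.0]

# (sub)gradient of the second order absolute difference d_2(x,y,z) with p = 1
ξ = ∇TV2(M, (x, y, z), 1)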

source
Manopt.∇acceleration_bezierMethod
∇acceleration_bezier(
    M::Manifold,
    B::AbstractVector{P},
    degrees::AbstractVector{<:Integer},
    T::AbstractVector{<:AbstractFloat}
) where {P}

compute the gradient of the discretized acceleration of a (composite) Bézier curve $c_B(t)$ on the Manifold M with respect to its control points B given as a point on the PowerManifold assuming C1 conditions and known degrees. The curve is evaluated at the points given in T (elementwise in $[0,N]$, where $N$ is the number of segments of the Bézier curve). The get_bezier_junctions are fixed for this gradient (interpolation constraint). For the unconstrained gradient, see ∇L2_acceleration_bezier and set $λ=0$ therein. This gradient is computed using adjoint_Jacobi_fields. For details, see [BergmannGousenbourger2018]. See de_casteljau for more details on the curve.
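A hypothetical sketch for a single cubic segment on the sphere; the reduced-form layout of B (here simply the four control points) and the sampling points in T are assumptions:

using Manopt, Manifolds

M = Sphere(2)
# four control points of one cubic segment, given as a point on the power manifold
B = [[1.0, 0.0, 0.0], [1/√2, 1/√2, 0.0], [0.0, 1/√2, 1/√2], [0.0, 0.0, 1.0]]
degrees = [3]
T = collect(range(0.0, 1.0; length=20))

ξ = ∇acceleration_bezier(M, B, degrees, T)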

See also

cost_acceleration_bezier, ∇L2_acceleration_bezier, cost_L2_acceleration_bezier.

source
Manopt.∇distanceFunction
∇distance(M,y,x[, p=2])

compute the (sub)gradient of the distance (squared)

\[f(x) = \frac{1}{2} d^p_{\mathcal M}(x,y)\]

to a fixed point y on the manifold M, where p is an integer. The gradient reads

\[ \nabla f(x) = -d_{\mathcal M}^{p-2}(x,y)\log_xy\]

for $p\neq 1$ or $x\neq y$. Note that for the remaining case $p=1$, $x=y$ the function is not differentiable. In this case, the function returns the corresponding zero tangent vector, since this is an element of the subdifferential.

Optional

  • p – (2) the exponent of the distance, i.e. the default is the squared distance
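A sketch on the sphere, assuming Manifolds.jl's Sphere:

using Manopt, Manifolds

M = Sphere(2)
y = [1.0, 0.0, 0.0]   # fixed point
x = [0.0, 1.0, 0.0]

ξ = ∇distance(M, y, x)      # gradient of 1/2 d^2(x,y), i.e. -log_x y
ξ1 = ∇distance(M, y, x, 1)  # (sub)gradient of 1/2 d(x,y)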
source
Manopt.∇intrinsic_infimal_convolution_TV12Method
∇u,∇v = ∇intrinsic_infimal_convolution_TV12(M,f,u,v,α,β)

compute (sub)gradient of the intrinsic infimal convolution model using the mid point model of second order differences, see costTV2, i.e. for some $f ∈ \mathcal M$ on a PowerManifold manifold $\mathcal M$ this function computes the (sub)gradient of

\[E(u,v) = \frac{1}{2}\sum_{i ∈ \mathcal G} d_{\mathcal M}\bigl(g(\tfrac{1}{2},u_i,v_i),f_i\bigr) + \alpha \bigl( \beta\mathrm{TV}(u) + (1-\beta)\mathrm{TV}_2(v) \bigr),\]

where both total variations refer to the intrinsic ones, ∇TV and ∇TV2, respectively.
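A sketch for a short signal of angles; the power manifold construction and the initialization of u and v are assumptions:

using Manopt, Manifolds

M = PowerManifold(Circle(), 5)
f = [0.0, 0.1, 0.3, 0.35, 0.7]   # data
u = copy(f)                      # TV component
v = copy(f)                      # TV2 component
α, β = 1.0, 0.5

∇u, ∇v = ∇intrinsic_infimal_convolution_TV12(M, f, u, v, α, β)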

source