Gradients
For a function $f\colon\mathcal M\to\mathbb R$ the Riemannian gradient $\nabla f(x)$ at $x\in\mathcal M$ is given by the unique tangent vector fulfilling

$\langle \nabla f(x), \xi\rangle_x = D_x f[\xi]\quad\text{for all } \xi \in T_x\mathcal M,$

where $D_xf[\xi]$ denotes the differential of $f$ at $x$ with respect to the tangent direction (vector) $\xi$, in other words the directional derivative.
This page collects the available gradients.
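As a quick illustration of the defining relation above, the following minimal sketch compares the inner product with a gradient against a finite-difference directional derivative. It assumes the Sphere together with exp, log, inner, and distance from Manifolds.jl (not part of this page) and uses the ∇distance gradient documented below; the concrete points are made up for the example.

```julia
using Manopt, Manifolds

M = Sphere(2)
x = [1.0, 0.0, 0.0]
y = [0.0, 1.0, 0.0]
f(p) = 0.5 * distance(M, p, y)^2        # f(x) = ½ d²(x, y)

ξ = 0.5 * log(M, x, y)                   # an arbitrary tangent direction at x
grad = ∇distance(M, y, x)                # Riemannian gradient of f (default p = 2)

t = 1e-6
Dfξ = (f(exp(M, x, t * ξ)) - f(x)) / t   # finite-difference approximation of D_x f[ξ]
isapprox(inner(M, x, grad, ξ), Dfξ; atol=1e-4)   # ≈ true, up to discretization error
```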
Manopt.forward_logs — Method
ξ = forward_logs(M,x)
compute the forward logs $F$ (generalizing forward differences) occurring, in the power manifold array, i.e. the function

$F_i(x) = \sum_{j \in \mathcal I_i} \log_{x_i} x_j, \quad i \in \mathcal G,$

where $\mathcal G$ is the set of indices of the PowerManifold manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.
Input
- M – a PowerManifold manifold
- x – a point.

Output
- ξ – resulting tangent vector in $T_x\mathcal N$ representing the logs, where $\mathcal N$ is the power manifold with the number of dimensions added to size(x).
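A minimal usage sketch, assuming Manifolds.jl's Sphere, PowerManifold, and NestedPowerRepresentation (not part of this page); the points are arbitrary:

```julia
using Manopt, Manifolds

S = Sphere(2)
N = PowerManifold(S, NestedPowerRepresentation(), 4)
x = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]]

ξ = forward_logs(N, x)
# entry i collects log_{x_i} x_j over the forward neighbors j of i;
# indices without forward neighbors contribute zero tangent vectors.
```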
Manopt.∇L2_acceleration_bezier — Method
∇L2_acceleration_bezier(
M::Manifold,
B::AbstractVector{P},
degrees::AbstractVector{<:Integer},
T::AbstractVector{<:AbstractFloat},
λ::Float64,
d::AbstractVector{P}
) where {P}
compute the gradient of the discretized acceleration of a composite Bézier curve on the Manifold M with respect to its control points B together with a data term that relates the junction points $p_i$ to the data d with a weight $\lambda$ compared to the acceleration. The curve is evaluated at the points given in T (elementwise in $[0,N]$), where $N$ is the number of segments of the Bézier curve. The summands are ∇distance for the data term and ∇acceleration_bezier for the acceleration with interpolation constraints. Here the get_bezier_junctions are included in the optimization, i.e. setting $λ=0$ yields the unconstrained acceleration minimization. Note that this is ill-posed, since any Bézier curve identical to a geodesic is a minimizer.
Note that the Bézier curve is given in reduced form as a point on a PowerManifold, together with the degrees of the segments; assuming a differentiable curve, the segments can be reconstructed internally.
See also
∇acceleration_bezier, cost_L2_acceleration_bezier, cost_acceleration_bezier.
Manopt.∇TV — Function
ξ = ∇TV(M,λ,x,[p])
Compute the (sub)gradient $\partial F$ of all forward differences occurring, in the power manifold array, i.e. of the function

$F(x) = \frac{1}{p}\sum_{i\in\mathcal G} \sum_{j \in \mathcal I_i} d^p_{\mathcal M}(x_i,x_j),$

where $i$ runs over all indices of the PowerManifold manifold M and $\mathcal I_i$ denotes the forward neighbors of $i$.
Input
- M – a PowerManifold manifold
- x – a point.

Output
- ξ – resulting tangent vector in $T_x\mathcal M$.
Manopt.∇TV — Method
∇TV(M,(x,y),[p=1])
compute the (sub) gradient of $\frac{1}{p}d^p_{\mathcal M}(x,y)$ with respect to both $x$ and $y$.
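For the two-point case a small sketch, assuming Manifolds.jl's Sphere (the points are arbitrary), reads:

```julia
using Manopt, Manifolds

M = Sphere(2)
x = [1.0, 0.0, 0.0]
y = [0.0, 1.0, 0.0]

ξ = ∇TV(M, (x, y))   # default p = 1, i.e. (sub)gradient of d_ℳ(x, y) w.r.t. x and y
# for p = 1 and x ≠ y one expects the x-component to be -log_x(y) / d_ℳ(x, y)
```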
Manopt.∇TV2 — Function
∇TV2(M,q [,p=1])
computes the (sub) gradient of $\frac{1}{p}d_2^p(x_1,x_2,x_3)$ with respect to all $x_1,x_2,x_3$ occurring along any array dimension in the point q, where M is the corresponding PowerManifold.
Manopt.∇TV2 — Method
∇TV2(M,(x,y,z),p)
computes the (sub) gradient of $\frac{1}{p}d_2^p(x,y,z)$ with respect to $x$, $y$, and $z$, where $d_2$ denotes the second order absolute difference using the mid point model, i.e. let

$\mathcal C_{x,z} = \bigl\{ c \in \mathcal M \mid c = g(\tfrac{1}{2}; x, z) \text{ for a geodesic } g \text{ from } x \text{ to } z \bigr\}$

denote the mid points between $x$ and $z$ on the manifold $\mathcal M$. Then the absolute second order difference is defined as

$d_2(x,y,z) = \min_{c \in \mathcal C_{x,z}} d_{\mathcal M}(c,y).$

While the (sub)gradient with respect to $y$ is easy, the other two require the evaluation of an adjoint_Jacobi_field. See Illustration of the Gradient of a Second Order Difference for its derivation.
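A hedged sketch, assuming Manifolds.jl's Sphere and mid_point (not part of this page): if $y$ already is a mid point of $x$ and $z$, the second order difference vanishes and the returned (sub)gradient can be zero.

```julia
using Manopt, Manifolds

M = Sphere(2)
x = [1.0, 0.0, 0.0]
z = [0.0, 1.0, 0.0]
y = mid_point(M, x, z)        # y is the geodesic mid point, so d₂(x, y, z) = 0

ξ = ∇TV2(M, (x, y, z), 1)     # (sub)gradient with respect to x, y, and z
```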
Manopt.∇acceleration_bezier — Method
∇acceleration_bezier(
M::Manifold,
B::AbstractVector{P},
degrees::AbstractVector{<:Integer},
T::AbstractVector{<:AbstractFloat}
)
compute the gradient of the discretized acceleration of a (composite) Bézier curve $c_B(t)$ on the Manifold M with respect to its control points B given as a point on the PowerManifold, assuming $C^1$ conditions and known degrees. The curve is evaluated at the points given in T (elementwise in $[0,N]$, where $N$ is the number of segments of the Bézier curve). The get_bezier_junctions are fixed for this gradient (interpolation constraint). For the unconstrained gradient, see ∇L2_acceleration_bezier and set $λ=0$ therein. This gradient is computed using adjoint_Jacobi_fields. For details, see [BergmannGousenbourger2018]. See de_casteljau for more details on the curve.
See also
cost_acceleration_bezier, ∇L2_acceleration_bezier, cost_L2_acceleration_bezier.
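A hedged sketch of the call pattern for a single cubic segment, assuming Manifolds.jl's Sphere and that for one segment the reduced form is simply the list of its control points; all values are made up:

```julia
using Manopt, Manifolds

M = Sphere(2)
# four control points of one cubic segment, i.e. degrees = [3]
B = [
    [1.0, 0.0, 0.0],
    [cos(0.4), sin(0.4), 0.0],
    [cos(1.2), sin(1.2), 0.0],
    [0.0, 1.0, 0.0],
]
T = collect(0.0:0.1:1.0)                  # evaluation points in [0, N] with N = 1 segment

ξ = ∇acceleration_bezier(M, B, [3], T)    # gradient w.r.t. the control points
```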
Manopt.∇distance — Function
∇distance(M,y,x[, p=2])
compute the (sub)gradient of the distance (squared)

$f(x) = \frac{1}{p} d^p_{\mathcal M}(x,y)$

to a fixed point y on the manifold M, where p is an integer. The gradient reads

$\nabla f(x) = -d_{\mathcal M}^{p-2}(x,y)\log_x y$

for $p\neq 1$ or $x\neq y$. Note that for the remaining case $p=1$, $x=y$ the function is not differentiable. In this case, the function returns the corresponding zero tangent vector, since this is an element of the subdifferential.

Optional
- p – (2) the exponent of the distance, i.e. the default is the squared distance
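For the default $p=2$ the gradient reduces to $-\log_x y$, which a short sketch (assuming Manifolds.jl's Sphere and log; the points are arbitrary) can verify:

```julia
using Manopt, Manifolds

M = Sphere(2)
y = [0.0, 1.0, 0.0]            # the fixed point
x = [1.0, 0.0, 0.0]

grad = ∇distance(M, y, x)      # default p = 2, i.e. gradient of ½ d²(x, y)
isapprox(grad, -log(M, x, y))  # should be true
```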
Manopt.∇intrinsic_infimal_convolution_TV12 — Method
∇u,∇v = ∇intrinsic_infimal_convolution_TV12(M,f,u,v,α,β)
compute the (sub)gradient of the intrinsic infimal convolution model using the mid point model of second order differences, see costTV2, i.e. for some $f ∈ \mathcal M$ on a PowerManifold manifold $\mathcal M$ this function computes the (sub)gradient of

$E(u,v) = \frac{1}{2}\sum_{i ∈ \mathcal G} d_{\mathcal M}\bigl(g(\tfrac{1}{2};u_i,v_i),f_i\bigr)^2 + \alpha\bigl(\beta\,\mathrm{TV}(u) + (1-\beta)\,\mathrm{TV}_2(v)\bigr),$

where both total variations refer to the intrinsic ones, ∇TV and ∇TV2, respectively.
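A hedged sketch of the call pattern only, assuming Manifolds.jl's PowerManifold with NestedPowerRepresentation; the data and the initialization of u and v are arbitrary and only illustrate the argument layout:

```julia
using Manopt, Manifolds

S = Sphere(2)
N = PowerManifold(S, NestedPowerRepresentation(), 3)
f = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # data on the power manifold
u = deepcopy(f)     # first component (total variation part)
v = deepcopy(f)     # second component (second order total variation part)

∇u, ∇v = ∇intrinsic_infimal_convolution_TV12(N, f, u, v, 1.0, 0.5)   # α = 1.0, β = 0.5
```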
- BergmannGousenbourger2018: Bergmann, R. and Gousenbourger, P.-Y.: A variational model for data fitting on manifolds by minimizing the acceleration of a Bézier curve. Frontiers in Applied Mathematics and Statistics (2018). doi: 10.3389/fams.2018.00059, arXiv: 1807.10090.