# Gradients

For a function $f\colon\mathcal M\to\mathbb R$ the Riemannian gradient $\nabla f(x)$ at $x\in\mathcal M$ is given by the unique tangent vector fulfilling

```math
\langle \nabla f(x), \xi\rangle_x = D_xf[\xi] \quad\text{for all } \xi\in T_x\mathcal M,
```

where $D_xf[\xi]$ denotes the differential of $f$ at $x$ with respect to the tangent direction (vector) $\xi$ or in other words the directional derivative.
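As a concrete illustration (a standard fact added here, not part of the original docstrings): for a sphere $\mathbb S^{n-1}\subset\mathbb R^n$ with the metric inherited from the embedding, the Riemannian gradient of the restriction of a smooth $\bar f\colon\mathbb R^n\to\mathbb R$ is the tangential projection of its Euclidean gradient,

```math
\nabla f(x) = \bigl(I - xx^{\mathrm{T}}\bigr)\nabla\bar f(x),
```

since projecting onto $T_x\mathbb S^{n-1} = \{\xi : x^{\mathrm{T}}\xi = 0\}$ leaves the directional derivatives $D_xf[\xi]$ along tangent directions unchanged.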

This page collects the available gradients.

`Manopt.forward_logs` — Method

`ξ = forward_logs(M,x)`

compute the forward logs $F$ (generalizing forward differences) occurring in the power manifold array, i.e. the function

```math
F_i(x) = \sum_{j\in\mathcal I_i} \log_{x_i} x_j, \qquad i\in\mathcal G,
```

where $\mathcal G$ is the set of indices of the `PowerManifold` manifold `M` and $\mathcal I_i$ denotes the forward neighbors of $i$.

**Input**

- `M` – a `PowerManifold` manifold
- `x` – a point.

**Output**

- `ξ` – resulting tangent vector in $T_x\mathcal N$ representing the logs, where $\mathcal N$ is the power manifold with the number of dimensions added to `size(x)`.
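A minimal usage sketch (hypothetical; it assumes the `Circle` and `PowerManifold` constructors from Manifolds.jl and is not taken from the Manopt documentation):

```julia
using Manopt, Manifolds

# Assumption: a power manifold of 4 points on the circle S^1,
# with the signal stored as one angle per grid node.
M = PowerManifold(Circle(), 4)
x = [0.1, 0.2, 0.4, 0.8]

# ξ[i] then collects log_{x[i]} x[j] over the forward neighbors j of i.
ξ = forward_logs(M, x)
```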

`Manopt.∇TV` — Function

`ξ = ∇TV(M,λ,x,[p])`

Compute the (sub)gradient $\partial F$ of all forward differences occurring in the power manifold array, i.e. of the function

```math
F(x) = \sum_{i}\sum_{j\in\mathcal I_i} \frac{1}{p} d^p_{\mathcal M}(x_i, x_j),
```

where $i$ runs over all indices of the `PowerManifold` manifold `M` and $\mathcal I_i$ denotes the forward neighbors of $i$.

**Input**

- `M` – a `PowerManifold` manifold
- `x` – a point.

**Output**

- `ξ` – resulting tangent vector in $T_x\mathcal M$.
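As a sanity check (an illustration, not part of the original docstring): on a Euclidean power manifold $\mathcal M = (\mathbb R)^n$ the function $F$ reduces to the classical anisotropic total variation term

```math
F(x) = \sum_{i}\sum_{j\in\mathcal I_i} \frac{1}{p}\lvert x_j - x_i\rvert^p,
```

so for $p=1$ its subgradient at $x_i$ collects the signs of the forward and backward differences meeting at $i$.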

`Manopt.∇TV` — Method

`∇TV(M,(x,y),[p=1])`

compute the (sub)gradient of $\frac{1}{p}d^p_{\mathcal M}(x,y)$ with respect to both $x$ and $y$.
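Spelled out for $p=1$ and $x\neq y$ (a derivation added for illustration), the two components are the unit-norm tangent vectors pointing away from the respective other point:

```math
\nabla_x d_{\mathcal M}(x,y) = -\frac{\log_x y}{d_{\mathcal M}(x,y)},
\qquad
\nabla_y d_{\mathcal M}(x,y) = -\frac{\log_y x}{d_{\mathcal M}(x,y)}.
```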

`Manopt.∇TV2` — Function

`∇TV2(M,q [,p=1])`

computes the (sub)gradient of $\frac{1}{p}d_2^p(x_1,x_2,x_3)$ with respect to all $x_1,x_2,x_3$ occurring along any array dimension in the point `x`, where `M` is the corresponding `PowerManifold`.

`Manopt.∇TV2` — Method

`∇TV2(M,(x,y,z),p)`

computes the (sub)gradient of $\frac{1}{p}d_2^p(x,y,z)$ with respect to $x$, $y$, and $z$, where $d_2$ denotes the second order absolute difference using the mid point model, i.e. let

```math
c = \gamma_{x,z}\Bigl(\frac{1}{2}\Bigr)
```

denote the mid point between $x$ and $z$ on the manifold $\mathcal M$. Then the absolute second order difference is defined as

```math
d_2(x,y,z) = d_{\mathcal M}(c, y).
```

While the (sub)gradient with respect to $y$ is easy, the other two require the evaluation of an `adjoint_Jacobi_field`. See Illustration of the Gradient of a Second Order Difference for its derivation.
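For intuition (a Euclidean sanity check, not from the original docstring): in $\mathcal M = \mathbb R^n$ the mid point is $c = \frac{x+z}{2}$, so

```math
d_2(x,y,z) = \Bigl\lVert \frac{x+z}{2} - y \Bigr\rVert = \frac{1}{2}\lVert x - 2y + z\rVert,
```

which recovers the classical second order difference up to the factor $\frac{1}{2}$.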

`Manopt.∇distance` — Function

`∇distance(M,y,x[, p=2])`

compute the (sub)gradient of the distance (squared)

```math
f(x) = \frac{1}{p} d^p_{\mathcal M}(x, y)
```

to a fixed point `y` on the manifold `M`, where `p` is an integer. The gradient reads

```math
\nabla f(x) = -d_{\mathcal M}^{p-2}(x,y)\log_x y
```

for $p\neq 1$ or $x\neq y$. Note that for the remaining case $p=1$, $x=y$ the function is not differentiable. In this case, the function returns the corresponding zero tangent vector, since this is an element of the subdifferential.

**Optional**

- `p` – (`2`) the exponent of the distance, i.e. the default is the squared distance.
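For example (illustration only): with the default $p=2$ the formula simplifies to

```math
\nabla\Bigl(\frac{1}{2}d^2_{\mathcal M}(x,y)\Bigr) = -\log_x y,
```

so a gradient descent step at $x$ moves along the geodesic towards the fixed point `y`; this is the basic building block when computing a Riemannian center of mass.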

`Manopt.∇intrinsic_infimal_convolution_TV12` — Method

`∇u,∇v = ∇intrinsic_infimal_convolution_TV12(M,f,u,v,α,β)`

compute the (sub)gradient of the intrinsic infimal convolution model using the mid point model of second order differences, see `costTV2`, i.e. for some $f\in\mathcal M$ on a `PowerManifold` manifold $\mathcal M$ this function computes the (sub)gradient of

```math
E(u,v) = \frac{1}{2}\sum_{i\in\mathcal G} d_{\mathcal M}\bigl(g(\tfrac{1}{2}, u_i, v_i), f_i\bigr)^2 + \alpha\,\mathrm{TV}(u) + \beta\,\mathrm{TV}_2(v),
```

where $g(\tfrac{1}{2}, u_i, v_i)$ denotes the mid point of a geodesic from $u_i$ to $v_i$, and both total variations refer to the intrinsic ones, `∇TV` and `∇TV2`, respectively.