Douglas–Rachford Algorithm
The (Parallel) Douglas–Rachford ((P)DR) Algorithm was generalized to Hadamard manifolds in [Bergmann, Persch, Steidl, 2016].
The aim is to minimize the sum
\[F(x) = f(x) + g(x)\]
on a manifold, where the two summands have proximal maps $\operatorname{prox}_{λ f}, \operatorname{prox}_{λ g}$ that are easy to evaluate (maybe in closed form, or not too costly to approximate). Further, define the reflection operator at the proximal map as
\[\operatorname{refl}_{λ f}(x) = \exp_{\operatorname{prox}_{λ f}(x)} \bigl( -\log_{\operatorname{prox}_{λ f}(x)} x \bigr).\]
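For illustration, this reflection can be expressed with the exponential and logarithmic maps. The following is a minimal sketch, assuming a proximal map with the allocating signature (M, λ, x); the helper name reflect_at_prox is only illustrative and not part of the Manopt API (Manopt.jl provides reflect for the reflection itself).

```julia
using Manifolds, Manopt

# Minimal sketch of the reflection at a proximal map.
# `prox` is assumed to have the allocating signature (M, λ, x);
# `reflect_at_prox` is an illustrative name, not part of the Manopt API.
function reflect_at_prox(M, prox, λ, x)
    p = prox(M, λ, x)                # p = prox_{λf}(x)
    return exp(M, p, -log(M, p, x))  # exp_p(-log_p x): reflect x at p
end
```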
Let $\alpha_k ∈ [0,1]$ with $\sum_{k ∈ \mathbb N} \alpha_k(1-\alpha_k) = \infty$ and $λ > 0$ (which might depend on the iteration $k$ as well) be given.
Then the (P)DR algorithm for initial data $x_0 ∈ \mathcal H$ reads as follows.
Initialization
Initialize $t_0 = x_0$ and $k=0$
Iteration
Repeat until a convergence criterion is reached
- Compute $s_k = \operatorname{refl}_{λ f}\operatorname{refl}_{λ g}(t_k)$
- Within that operation, store $x_{k+1} = \operatorname{prox}_{λ g}(t_k)$, which is the proximal point at which the inner reflection reflects.
- Compute $t_{k+1} = g(\alpha_k; t_k, s_k)$, i.e. the point reached after traveling the fraction $\alpha_k$ along the shortest geodesic from $t_k$ to $s_k$
- Set $k = k+1$
Result
The result is given by the last computed $x_K$.
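As a sketch, one iteration of this scheme can be written as follows, assuming allocating proximal maps prox_f and prox_g with signature (M, λ, x), the reflect function from Manopt.jl, and shortest_geodesic from Manifolds.jl for the relaxation step $g(\alpha_k; t_k, s_k)$.

```julia
using Manifolds, Manopt

# Minimal sketch of one (non-parallel) Douglas–Rachford iteration.
# prox_f, prox_g are assumed to have the allocating signature (M, λ, x).
function douglas_rachford_step(M, prox_f, prox_g, t_k, λ, α_k)
    x_next = prox_g(M, λ, t_k)                    # x_{k+1} = prox_{λg}(t_k)
    r = reflect(M, x_next, t_k)                   # refl_{λg}(t_k)
    s_k = reflect(M, prox_f(M, λ, r), r)          # s_k = refl_{λf}(refl_{λg}(t_k))
    t_next = shortest_geodesic(M, t_k, s_k, α_k)  # t_{k+1} = g(α_k; t_k, s_k)
    return t_next, x_next
end
```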
For the parallel version, the first proximal map is a vectorial version, where in each component one prox is applied to the corresponding copy of $t_k$. The second proximal map corresponds to the indicator function of the set where all copies are equal (in $\mathcal H^n$, where $n$ is the number of copies), so the second prox is the Riemannian mean.
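The following sketch illustrates this reformulation for $n$ proximal maps; parallel_proxes is an illustrative helper (not part of the Manopt API), assuming each entry of proxes has the allocating signature (M, λ, x) and that the Riemannian mean is computed with mean from Manifolds.jl.

```julia
using Manifolds, Manopt, Statistics

# Illustrative sketch: lift n proximal maps to the power manifold M^n
# and replace them by two proximal maps.
function parallel_proxes(M, proxes)
    n = length(proxes)
    N = PowerManifold(M, NestedPowerRepresentation(), n)
    # first prox: apply each prox to its own copy of the iterate
    prox1 = (N, λ, X) -> [proxes[i](M, λ, X[i]) for i in 1:n]
    # second prox: project onto the set of equal copies, i.e. the Riemannian mean
    prox2 = (N, λ, X) -> fill(mean(M, X), n)
    return N, prox1, prox2
end
```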
Interface
Manopt.DouglasRachford
— Function DouglasRachford(M, F, proxMaps, x)
Computes the Douglas-Rachford algorithm on the manifold $\mathcal M$, initial data $x_0$ and the (two) proximal maps proxMaps.
For $k>2$ proximal maps, the problem is reformulated using the parallel Douglas–Rachford: a vectorial proximal map on the power manifold $\mathcal M^k$ and the proximal map of the set that identifies all entries again, i.e. the Karcher mean. This hence also boils down to two proximal maps, though each evaluates proximal maps in parallel, i.e. component wise in a vector.
Input
- M – a Riemannian manifold $\mathcal M$
- F – a cost function consisting of a sum of cost functions
- proxes – functions of the form (λ,x) -> ... performing a proximal map, where λ denotes the proximal parameter, for each of the summands of F.
- x0 – initial data $x_0 ∈ \mathcal M$
Optional values
The default values are given in brackets.
- evaluation – (AllocatingEvaluation) specify whether the proximal maps work by allocation (default) of the form prox(M, λ, x) or MutatingEvaluation in place, i.e. of the form prox!(M, y, λ, x).
- λ – ((iter) -> 1.0) function to provide the value for the proximal parameter during the calls
- α – ((iter) -> 0.9) relaxation of the step from old to new iterate, i.e. $t_{k+1} = g(α_k; t_k, s_k)$, where $s_k$ is the result of the double reflection involved in the DR algorithm
- R – (reflect) method employed in the iteration to perform the reflection of x at the prox p.
- stopping_criterion – (StopWhenAny(StopAfterIteration(200), StopWhenChangeLess(10.0^-5))) a StoppingCriterion.
- parallel – (false) clarify that we are doing a parallel DR, i.e. on a PowerManifold manifold with two proxes. This can be used to trigger parallel Douglas–Rachford if you enter with two proxes. Keep in mind that a parallel Douglas–Rachford implicitly works on a PowerManifold manifold and its first argument is the result then (assuming all are equal after the second prox).

and the ones that are passed to decorate_options for decorators.
Output
the obtained (approximate) minimizer $x^*$, see get_solver_return for details
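As a usage sketch (not from the original documentation), consider minimizing the sum of two halved squared distances on a sphere. The proximal map of $\tfrac{1}{2}d^2(\cdot,p)$ is assumed here in its known closed form, a geodesic step of length $\tfrac{λ}{1+λ}$ towards $p$ (exact on Hadamard manifolds, used only locally on the sphere), and the cost is written with the manifold as first argument, as in recent Manopt.jl versions.

```julia
using Manifolds, Manopt

M = Sphere(2)
p, q = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]

# F(x) = 1/2 d(x,p)^2 + 1/2 d(x,q)^2, minimized at the midpoint of p and q
F(M, x) = 0.5 * distance(M, x, p)^2 + 0.5 * distance(M, x, q)^2

# prox of 1/2 d(·,a)^2: move from x towards a along the geodesic by λ/(1+λ)
prox1 = (M, λ, x) -> shortest_geodesic(M, x, p, λ / (1 + λ))
prox2 = (M, λ, x) -> shortest_geodesic(M, x, q, λ / (1 + λ))

x0 = [0.0, 0.0, 1.0]
x_star = DouglasRachford(M, F, [prox1, prox2], x0)
```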
Manopt.DouglasRachford!
— Function DouglasRachford!(M, F, proxMaps, x)
Computes the Douglas-Rachford algorithm on the manifold $\mathcal M$, initial data $x_0$ and the (two) proximal maps proxMaps in place of x.
For $k>2$ proximal maps, the problem is reformulated using the parallel Douglas–Rachford: a vectorial proximal map on the power manifold $\mathcal M^k$ and the proximal map of the set that identifies all entries again, i.e. the Karcher mean. This hence also boils down to two proximal maps, though each evaluates proximal maps in parallel, i.e. component wise in a vector.
Input
- M – a Riemannian manifold $\mathcal M$
- F – a cost function consisting of a sum of cost functions
- proxes – functions of the form (λ,x) -> ... performing a proximal map, where λ denotes the proximal parameter, for each of the summands of F.
- x0 – initial data $x_0 ∈ \mathcal M$
For more options, see DouglasRachford.
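A corresponding in-place call, continuing the sphere sketch above (same assumed cost F and proximal maps prox1, prox2), could look like:

```julia
x0 = [0.0, 0.0, 1.0]
DouglasRachford!(M, F, [prox1, prox2], x0)
# x0 now holds the (approximate) minimizer
```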
Options
Manopt.DouglasRachfordOptions
— Type DouglasRachfordOptions <: Options
Stores all options required for the DouglasRachford algorithm.
Fields
- x – the current iterate (result). For the parallel Douglas-Rachford, this is not a value from the PowerManifold manifold but the mean.
- s – the last result of the double reflection at the proxes, relaxed by α.
- λ – function to provide the value for the proximal parameter during the calls
- α – relaxation of the step from old to new iterate, i.e. $x^{(k+1)} = g(α(k); x^{(k)}, t^{(k)})$, where $t^{(k)}$ is the result of the double reflection involved in the DR algorithm
- R – method employed in the iteration to perform the reflection of x at the prox p.
- stop – a StoppingCriterion
- parallel – indicate whether we are running a parallel Douglas-Rachford or not.
Constructor
DouglasRachfordOptions(M, p; kwargs...)
Generate the options for a manifold M and an initial point p, where the following keyword arguments can be used:
- λ – ((iter) -> 1.0) function to provide the value for the proximal parameter during the calls
- α – ((iter) -> 0.9) relaxation of the step from old to new iterate, i.e. $x^{(k+1)} = g(α(k); x^{(k)}, t^{(k)})$, where $t^{(k)}$ is the result of the double reflection involved in the DR algorithm
- R – (reflect) method employed in the iteration to perform the reflection of x at the prox p.
- stopping_criterion – (StopAfterIteration(300)) a StoppingCriterion
- parallel – (false) indicate whether we are running a parallel Douglas-Rachford or not.
For specific DebugActions and RecordActions see also Cyclic Proximal Point.
Literature
- [Bergmann, Persch, Steidl, 2016]
Bergmann, R; Persch, J.; Steidl, G.:
A Parallel Douglas–Rachford Algorithm for Minimizing ROF-like Functionals on Images with Values in Symmetric Hadamard Manifolds. SIAM Journal on Imaging Sciences, Volume 9, Number 3, pp. 901–937, 2016. doi: 10.1137/15M1052858, arXiv: 1512.02814.