How to record data during the iterations

Ronny Bergmann

The recording and debugging features make it possible to record nearly any data during the iterations. This tutorial illustrates how to:

  • record one value during the iterations;
  • record multiple values during the iterations and access them afterwards;
  • record within a subsolver;
  • define your own RecordAction to perform individual recordings.

Several predefined recordings exist, for example RecordCost or RecordGradient, if the problem the solver uses provides a gradient. For fields of the solver state, recording can also be done using RecordEntry. For other recordings, for example more advanced computations before storing a value, you can define your own RecordAction.

We illustrate these using the gradient descent from the Get started: optimize! tutorial.

Here the focus is on ways to investigate the behaviour during the iterations by using recording techniques.

Let’s first load the necessary packages.

using Manopt, Manifolds, Random, ManifoldDiff, LinearAlgebra
using ManifoldDiff: grad_distance
Random.seed!(42);

The Objective

We generate data and define our cost and gradient:

Random.seed!(42)
m = 30
M = Sphere(m)
n = 800
σ = π / 8
x = zeros(Float64, m + 1)
x[2] = 1.0
data = [exp(M, x, σ * rand(M; vector_at=x)) for i in 1:n]
f(M, p) = sum(1 / (2 * n) * distance.(Ref(M), Ref(p), data) .^ 2)
grad_f(M, p) = sum(1 / n * grad_distance.(Ref(M), data, Ref(p)))
grad_f (generic function with 1 method)

Plain Examples

For the high-level interfaces of the solvers, like gradient_descent, we have to set return_state to true to obtain the whole solver state and not only the resulting minimizer.

Then we can easily use the record= option to add recorded values. This keyword accepts RecordActions as well as several symbols as shortcuts, for example :Cost to record the cost, or, if your solver state has a field f, :f to record that entry. An overview of the symbols that can be used is given here.
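
For instance, since the gradient descent solver state stores the current gradient in a field called X, one could also record that field by its name. The following is only a hedged sketch of this variant and is not executed in this tutorial:

# Sketch: record the state's gradient field X (field name as used later in this tutorial)
# together with the cost; RX is a hypothetical variable name
RX = gradient_descent(M, f, grad_f, data[1]; record=[:X, :Cost], return_state=true)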

We first just record the cost after every iteration

R = gradient_descent(M, f, grad_f, data[1]; record=:Cost, return_state=true)
# Solver state for `Manopt.jl`s Gradient Descent
After 60 iterations

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-8: reached
Overall: reached
This indicates convergence: Yes

## Record
(Iteration = RecordCost(),)

From the returned state, we see that the GradientDescentState is encapsulated (decorated) within a RecordSolverState.

For such a state, one can attach different recorders to several points of the solver run, currently to :Start, :Stop, and :Iteration, where :Iteration is the default when using the record= keyword with a RecordAction as above. We can access all values recorded during the iterations by calling get_record(R, :Iteration) or, since this is the default, even shorter

get_record(R)
60-element Vector{Float64}:
 0.6868754085841272
 0.6240211444102516
 0.5900374782569905
 0.5691425134106757
 0.5512819383843195
 0.542136810022984
 0.5374585627386623
 0.5350045365259574
 0.5337243124406585
 0.5330491236590466
 0.5326944302021914
 0.5325071127227716
 0.5324084047176342
 ⋮
 0.5322977905736846
 0.5322977905736771
 0.5322977905736733
 0.5322977905736712
 0.5322977905736699
 0.5322977905736691
 0.5322977905736687
 0.5322977905736684
 0.5322977905736683
 0.5322977905736682
 0.5322977905736681
 0.5322977905736681

To record more than one value, you can pass an array with a mix of symbols and RecordActions, which internally introduces a RecordGroup. Such a group records a tuple of values in every iteration:

R2 = gradient_descent(M, f, grad_f, data[1]; record=[:Iteration, :Cost], return_state=true)
# Solver state for `Manopt.jl`s Gradient Descent
After 60 iterations

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-8: reached
Overall: reached
This indicates convergence: Yes

## Record
(Iteration = RecordGroup([RecordIteration(), RecordCost()]),)

Here, the symbol :Cost is mapped to the RecordCost action; similarly, :Iteration records the current iteration number i. To access these, you can first extract the group of records (that is, the records stored at :Iteration) and then access the :Cost entry

get_record_action(R2, :Iteration)
RecordGroup([RecordIteration(), RecordCost()])

Since iteration is the default, we can also omit it here again. To access single recorded values, one can use

get_record_action(R2)[:Cost]
60-element Vector{Float64}:
 0.6868754085841272
 0.6240211444102516
 0.5900374782569905
 0.5691425134106757
 0.5512819383843195
 0.542136810022984
 0.5374585627386623
 0.5350045365259574
 0.5337243124406585
 0.5330491236590466
 0.5326944302021914
 0.5325071127227716
 0.5324084047176342
 ⋮
 0.5322977905736846
 0.5322977905736771
 0.5322977905736733
 0.5322977905736712
 0.5322977905736699
 0.5322977905736691
 0.5322977905736687
 0.5322977905736684
 0.5322977905736683
 0.5322977905736682
 0.5322977905736681
 0.5322977905736681

This can also be done by using the high-level interface get_record

get_record(R2, :Iteration, :Cost)
60-element Vector{Float64}:
 0.6868754085841272
 0.6240211444102516
 0.5900374782569905
 0.5691425134106757
 0.5512819383843195
 0.542136810022984
 0.5374585627386623
 0.5350045365259574
 0.5337243124406585
 0.5330491236590466
 0.5326944302021914
 0.5325071127227716
 0.5324084047176342
 ⋮
 0.5322977905736846
 0.5322977905736771
 0.5322977905736733
 0.5322977905736712
 0.5322977905736699
 0.5322977905736691
 0.5322977905736687
 0.5322977905736684
 0.5322977905736683
 0.5322977905736682
 0.5322977905736681
 0.5322977905736681

Note that the first symbol again refers to the point where we record (not to the thing we record). We can also pass a tuple as the second argument to specify our own order within the tuples returned. Switching the order of the recorded cost and iteration can be done using

get_record(R2, :Iteration, (:Iteration, :Cost))
60-element Vector{Tuple{Int64, Float64}}:
 (1, 0.6868754085841272)
 (2, 0.6240211444102516)
 (3, 0.5900374782569905)
 (4, 0.5691425134106757)
 (5, 0.5512819383843195)
 (6, 0.542136810022984)
 (7, 0.5374585627386623)
 (8, 0.5350045365259574)
 (9, 0.5337243124406585)
 (10, 0.5330491236590466)
 (11, 0.5326944302021914)
 (12, 0.5325071127227716)
 (13, 0.5324084047176342)
 ⋮
 (49, 0.5322977905736846)
 (50, 0.5322977905736771)
 (51, 0.5322977905736733)
 (52, 0.5322977905736712)
 (53, 0.5322977905736699)
 (54, 0.5322977905736691)
 (55, 0.5322977905736687)
 (56, 0.5322977905736684)
 (57, 0.5322977905736683)
 (58, 0.5322977905736682)
 (59, 0.5322977905736681)
 (60, 0.5322977905736681)

A more complex example

To illustrate a more complex example, let’s record:

  • the iteration number, cost and gradient field, but only every sixth iteration;
  • the iteration at which we stop.

We first generate the problem and the state, to also illustrate how the lower-level interface works when not using the high-level interface gradient_descent.

p = DefaultManoptProblem(M, ManifoldGradientObjective(f, grad_f))
s = GradientDescentState(
    M,
    copy(data[1]);
    stopping_criterion=StopAfterIteration(200) | StopWhenGradientNormLess(10.0^-9),
)
# Solver state for `Manopt.jl`s Gradient Descent

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-9: not reached
Overall: not reached
This indicates convergence: No

We first build a RecordGroup to collect the three entries we want to record per iteration. We then wrap this in a RecordEvery to only record it every sixth iteration

rI = RecordEvery(
    RecordGroup([
        RecordIteration() => :Iteration,
        RecordCost() => :Cost,
        RecordEntry(similar(data[1]), :X) => :Gradient,
    ]),
    6,
)
RecordEvery(RecordGroup([RecordIteration(), RecordCost(), RecordEntry(:X)]), 6, true)

where the notation as a pair with a symbol can be read as “is accessible by”. Using the record= keyword with the symbol :Iteration is actually the same as what we specified here for the first group entry. For recording the final iteration number we use

sI = RecordIteration()
RecordIteration()

We now combine both into the RecordSolverState decorator. It acts exactly like any AbstractManoptSolverState, but additionally records something in every iteration. The recordings are stored in a dictionary of RecordActions, where :Iteration points to the group above (which is only active every sixth iteration) and :Stop points to sI, which is executed when the solver stops.

Note that the record= keyword of the high-level interface gradient_descent would only fill the :Iteration entry of this dictionary, but we can also pass pairs of the form Symbol => RecordAction into that keyword to obtain the same as in

r = RecordSolverState(s, Dict(:Iteration => rI, :Stop => sI))
# Solver state for `Manopt.jl`s Gradient Descent

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-9: not reached
Overall: not reached
This indicates convergence: No

## Record
(Iteration = RecordEvery(RecordGroup([RecordIteration(), RecordCost(), RecordEntry(:X)]), 6, true), Stop = RecordIteration())
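
As a hedged sketch of the equivalence just mentioned, the same recording dictionary could presumably be filled through the high-level interface by passing such pairs directly (not executed here):

gradient_descent(M, f, grad_f, data[1];
    stopping_criterion=StopAfterIteration(200) | StopWhenGradientNormLess(10.0^-9),
    record=[:Iteration => rI, :Stop => sI],  # Symbol => RecordAction pairs
    return_state=true,
)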

We now call the solver

res = solve!(p, r)
# Solver state for `Manopt.jl`s Gradient Descent
After 65 iterations

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-9: reached
Overall: reached
This indicates convergence: Yes

## Record
(Iteration = RecordEvery(RecordGroup([RecordIteration(), RecordCost(), RecordEntry(:X)]), 6, true), Stop = RecordIteration())

And we can check the recorded value at :Stop to see how many iterations were performed

get_record(res, :Stop)
1-element Vector{Int64}:
 65

and the other values during the iterations are

get_record(res, :Iteration, (:Iteration, :Cost))
10-element Vector{Tuple{Int64, Float64}}:
 (6, 0.542136810022984)
 (12, 0.5325071127227716)
 (18, 0.5323023757104093)
 (24, 0.5322978928223224)
 (30, 0.5322977928970517)
 (36, 0.5322977906274986)
 (42, 0.5322977905749401)
 (48, 0.5322977905736989)
 (54, 0.5322977905736691)
 (60, 0.5322977905736681)

where the names in the last tuple are those we assigned with the pairs when generating the record group. So similarly we can use :Gradient as specified before to access the recorded gradient values.
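
For example, a hedged sketch of accessing the recorded gradient values by the name we attached via the pair above:

# access the recorded gradient entries (the state's field X) by their assigned name
get_record(res, :Iteration, :Gradient)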

Recording from a Subsolver

One can also record from a subsolver. For that we need a problem that actually requires a subsolver. We take the constrained example from the How to print debug tutorial; see that tutorial for more details on the problem

d = 4
M2 = Sphere(d - 1)
v0 = project(M2, [ones(2)..., zeros(d - 2)...])
Z = v0 * v0'
#Cost and gradient
f2(M, p) = -tr(transpose(p) * Z * p) / 2
grad_f2(M, p) = project(M, p, -transpose.(Z) * p / 2 - Z * p / 2)
# Constraints
g(M, p) = -p # now p ≥ 0
mI = -Matrix{Float64}(I, d, d)
# Vector of gradients of the constraint components
grad_g(M, p) = [project(M, p, mI[:, i]) for i in 1:d]
p0 = project(M2, [ones(2)..., zeros(d - 3)..., 0.1])

We start directly with recording the subsolver’s iterations. We can specify what to record in the subsolver using the sub_kwargs keyword argument with a Symbol => value pair. Here we specify to record the iteration number and the cost in every subsolver step.

Furthermore, we have to “collect” this recording after every subsolver run. This is done with the :Subsolver symbol in the main record= keyword.

s1 = exact_penalty_method(
    M2,
    f2,
    grad_f2,
    p0;
    g = g,
    grad_g = grad_g,
    record = [:Iteration, :Cost, :Subsolver],
    sub_kwargs = [:record => [:Iteration, :Cost]],
    return_state=true,
);

Then the first entry of the record contains the iteration number, the second the (main solver’s) cost, and the third is the recording of the subsolver.

get_record(s1)[1]
(1, -0.4733019623455375, [(1, -0.4288382393589549), (2, -0.4366953425955692), (3, -0.43740366734999164), (4, -0.43744087180862923)])
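
To work with just the subsolver part of such an entry, here is a small hedged sketch: the third element of the tuple holds the subsolver record, so its cost values can be extracted with

# cost values of the subsolver run belonging to the first (outer) iteration
last.(get_record(s1)[1][3])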

When adding a number to record only every few iterations, the :Subsolver keyword of course still only “copies over” the subsolver recordings when the main recording is active. To avoid allocations in the other runs, specify the subsolver record as :WhenActive

s2 = exact_penalty_method(
    M2,
    f2,
    grad_f2,
    p0;
    g = g,
    grad_g = grad_g,
    record = [:Iteration, :Cost, :Subsolver, 25],
    sub_kwargs = [:record => [:Iteration, :Cost, :WhenActive]],
    return_state=true,
);

Then

get_record(s2)
4-element Vector{Tuple{Int64, Float64, Vector{Tuple{Int64, Float64}}}}:
 (25, -0.499448691933161, [(1, -0.4991469209203163)])
 (50, -0.499995278186848, [(1, -0.4999937264503264)])
 (75, -0.49999997014871317, [(1, -0.49999996322535156)])
 (100, -0.49999999983996535, [(1, -0.49999999981148996)])

Finally, instead of recording the iterations, we can also record the stopping criterion and the final cost by adding them to the :Stop entry of the subsolver’s record. Then we can specify, as so often in a tuple, that the :Subsolver should record :Stop (by default it takes over :Iteration)

s3 = exact_penalty_method(
    M2,
    f2,
    grad_f2,
    p0;
    g = g,
    grad_g = grad_g,
    record = [:Iteration, :Cost, (:Subsolver, :Stop), 25],
    sub_kwargs = [:record => [:Stop => [:Stop, :Cost]]],
    return_state=true,
);

Then the following also displays the reasons why each of the recorded subsolver runs stopped, together with the corresponding cost

get_record(s3)
4-element Vector{Tuple{Int64, Float64, Vector{Tuple{String, Float64}}}}:
 (25, -0.499448691933161, [("The algorithm reached approximately critical point after 1 iterations; the gradient norm (0.00021218898527217448) is less than 0.001.\n", -0.4991469209203163)])
 (50, -0.499995278186848, [("The algorithm reached approximately critical point after 1 iterations; the gradient norm (1.6025009584517956e-5) is less than 0.001.\n", -0.4999937264503264)])
 (75, -0.49999997014871317, [("The algorithm reached approximately critical point after 1 iterations; the gradient norm (9.966301158136346e-7) is less than 0.001.\n", -0.49999996322535156)])
 (100, -0.49999999983996535, [("The algorithm reached approximately critical point after 1 iterations; the gradient norm (5.4875346930715234e-8) is less than 0.001.\n", -0.49999999981148996)])

Writing your own RecordAction

Let’s investigate a case where we want to count the number of cost function evaluations, again just to illustrate, since for the gradient this is just one evaluation per iteration. We first define a cost that counts its own calls.

mutable struct MyCost{T}
    data::T
    count::Int
end
MyCost(data::T) where {T} = MyCost{T}(data, 0)
function (c::MyCost)(M, x)
    c.count += 1
    return sum(1 / (2 * length(c.data)) * distance.(Ref(M), Ref(x), c.data) .^ 2)
end
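
As a quick, hedged usage sketch (the instance c below is only for illustration and is discarded afterwards): each evaluation increments the counter.

c = MyCost(data)
c(M, data[1])  # evaluates the mean squared distance cost once
c.count        # the counter is now 1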

Next, we define our own, new RecordAction, which is a functor, that is, a struct that is also callable like a function. The function we have to implement has a signature similar to a single solver step, since it might get called in every iteration:

mutable struct RecordCount <: RecordAction
    recorded_values::Vector{Int}
    RecordCount() = new(Vector{Int}())
end
function (r::RecordCount)(p::AbstractManoptProblem, ::AbstractManoptSolverState, i)
    if i > 0
        push!(r.recorded_values, Manopt.get_cost_function(get_objective(p)).count)
    elseif i < 0 # reset if negative
        r.recorded_values = Vector{Int}()
    end
end

Now we can initialize the new cost and call the gradient descent. Note that this also illustrates the last use case, since you can pass symbol-action pairs into the record= array.

f3 = MyCost(data)
MyCost{Vector{Vector{Float64}}}([[-0.054658825167894595, -0.5592077846510423, -0.04738273828111257, -0.04682080720921302, 0.12279468849667038, 0.07171438895366239, -0.12930045409417057, -0.22102081626380404, -0.31805333254577767, 0.0065859500152017645  …  -0.21999168261518043, 0.19570142227077295, 0.340909965798364, -0.0310802190082894, -0.04674431076254687, -0.006088297671169996, 0.01576037011323387, -0.14523596850249543, 0.14526158060820338, 0.1972125856685378], [-0.08192376929745249, -0.5097715132187676, -0.008339904915541005, 0.07289741328038676, 0.11422036270613797, -0.11546739299835748, 0.2296996932628472, 0.1490467170835958, -0.11124820565850364, -0.11790721606521781  …  -0.16421249630470344, -0.2450575844467715, -0.07570080850379841, -0.07426218324072491, -0.026520181327346338, 0.11555341205250205, -0.0292955762365121, -0.09012096853677576, -0.23470556634911574, -0.026214242996704013], [-0.22951484264859257, -0.6083825348640186, 0.14273766477054015, -0.11947823367023377, 0.05984293499234536, 0.058820835498203126, 0.07577331705863266, 0.1632847202946857, 0.20244385489915745, 0.04389826920203656  …  0.3222365119325929, 0.009728730325524067, -0.12094785371632395, -0.36322323926212824, -0.0689253407939657, 0.23356953371702974, 0.23489531397909744, 0.078303336494718, -0.14272984135578806, 0.07844539956202407], [-0.0012588500237817606, -0.29958740415089763, 0.036738459489123514, 0.20567651907595125, -0.1131046432541904, -0.06032435985370224, 0.3366633723165895, -0.1694687746143405, -0.001987171245125281, 0.04933779858684409  …  -0.2399584473006256, 0.19889267065775063, 0.22468755918787048, 0.1780090580180643, 0.023703860700539356, -0.10212737517121755, 0.03807004103115319, -0.20569120952458983, -0.03257704254233959, 0.06925473452536687], [-0.035534309946938375, -0.06645560787329002, 0.14823972268208874, -0.23913346587232426, 0.038347027875883496, 0.10453333143286662, 0.050933995140290705, -0.12319549375687473, 0.12956684644537844, -0.23540367869989412  …  -0.41471772859912864, -0.1418984610380257, 0.0038321446836859334, 0.23655566917750157, -0.17500681300994742, -0.039189751036839374, -0.08687860620942896, -0.11509948162959047, 0.11378233994840942, 0.38739450723013735], [-0.3122539912469438, -0.3101935557860296, 0.1733113629107006, 0.08968593616209351, -0.1836344261367962, -0.06480023695256802, 0.18165070013886545, 0.19618275767992124, -0.07956460275570058, 0.0325997354656551  …  0.2845492418767769, 0.17406455870721682, -0.053101230371568706, -0.1382082812981627, 0.005830071475508364, 0.16739264037923055, 0.034365814374995335, 0.09107702398753297, -0.1877250428700409, 0.05116494897806923], [-0.04159442361185588, -0.7768029783272633, 0.06303616666722486, 0.08070518925253539, -0.07396265237309446, -0.06008109299719321, 0.07977141629715745, 0.019511027129056415, 0.08629917589924847, -0.11156298867318722  …  0.0792587504128044, -0.016444383900170008, -0.181746064577005, -0.01888129512990984, -0.13523922089388968, 0.11358102175659832, 0.07929049608459493, 0.1689565359083833, 0.07673657951723721, -0.1128480905648813], [-0.21221814304651335, -0.5031823821503253, 0.010326342133992458, -0.12438192100961257, 0.04004758695231872, 0.2280527500843805, -0.2096243232022162, -0.16564828762420294, -0.28325749481138984, 0.17033534605245823  …  -0.13599096505924074, 0.28437770540525625, 0.08424426798544583, -0.1266207606984139, 0.04917635557603396, -0.00012608938533809706, -0.04283220254770056, -0.08771365647566572, 0.14750169103093985, 0.11601120086036351], [0.10683290707435536, -0.17680836277740156, 
0.23767458301899405, 0.12011180867097299, -0.029404774462600154, 0.11522028383799933, -0.3318174480974519, -0.17859266746938374, 0.04352373642537759, 0.2530382802667988  …  0.08879861736692073, -0.004412506987801729, 0.19786810509925895, -0.1397104682727044, 0.09482328498485094, 0.05108149065160893, -0.14578343506951633, 0.3167479772660438, 0.10422673169182732, 0.21573150015891313], [-0.024895624707466164, -0.7473912016432697, -0.1392537238944721, -0.14948896791465557, -0.09765393283580377, 0.04413059403279867, -0.13865379004720355, -0.071032040283992, 0.15604054722246585, -0.10744260463413555  …  -0.14748067081342833, -0.14743635071251024, 0.0643591937981352, 0.16138827697852615, -0.12656652133603935, -0.06463635704869083, 0.14329582429103488, -0.01113113793821713, 0.29295387893749997, 0.06774523575259782]  …  [0.011874845316569967, -0.6910596618389588, 0.21275741439477827, -0.014042545524367437, -0.07883613103495014, -0.0021900966696246776, -0.033836430464220496, 0.2925813113264835, -0.04718187201980008, 0.03949680289730036  …  0.0867736586603294, 0.0404682510051544, -0.24779813848587257, -0.28631514602877145, -0.07211767532456789, -0.15072898498180473, 0.017855923621826746, -0.09795357710255254, -0.14755229203084924, 0.1305005778855436], [0.013457629515450426, -0.3750353654626534, 0.12349883726772073, 0.3521803555005319, 0.2475921439420274, 0.006088649842999206, 0.31203183112392907, -0.036869203979483754, -0.07475746464056504, -0.029297797064479717  …  0.16867368684091563, -0.09450564983271922, -0.0587273302122711, -0.1326667940553803, -0.25530237980444614, 0.37556905374043376, 0.04922612067677609, 0.2605362549983866, -0.21871556587505667, -0.22915883767386164], [0.03295085436260177, -0.971861604433394, 0.034748713521512035, -0.0494065013245799, -0.01767479281403355, 0.0465459739459587, 0.007470494722096038, 0.003227960072276129, 0.0058328596338402365, -0.037591237446692356  …  0.03205152122876297, 0.11331109854742015, 0.03044900529526686, 0.017971704993311105, -0.009329252062960229, -0.02939354719650879, 0.022088835776251863, -0.02546111553658854, -0.0026257225461427582, 0.005702111697172774], [0.06968243992532257, -0.7119502191435176, -0.18136614593117445, -0.1695926215673451, 0.01725015359973796, -0.00694164951158388, -0.34621134287344574, 0.024709256792651912, -0.1632255805999673, -0.2158226433583082  …  -0.14153772108081458, -0.11256850346909901, 0.045109821764180706, -0.1162754336222613, -0.13221711766357983, 0.005365354776191061, 0.012750671705879105, -0.018208207549835407, 0.12458753932455452, -0.31843587960340897], [-0.19830349374441875, -0.6086693423968884, 0.08552341811170468, 0.35781519334042255, 0.15790663648524367, 0.02712571268324985, 0.09855601327331667, -0.05840653973421127, -0.09546429767790429, -0.13414717696055448  …  -0.0430935804718714, 0.2678584478951765, 0.08780994289014614, 0.01613469379498457, 0.0516187906322884, -0.07383067566731401, -0.1481272738354552, -0.010532317187265649, 0.06555344745952187, -0.1506167863762911], [-0.04347524125197773, -0.6327981074196994, -0.221116680035191, 0.0282207467940456, -0.0855024881522933, 0.12821801740178346, 0.1779499563280024, -0.10247384887512365, 0.0396432464100116, -0.0582580338112627  …  0.1253893207083573, 0.09628202269764763, 0.3165295473947355, -0.14915034201394833, -0.1376727867817772, -0.004153096613530293, 0.09277957650773738, 0.05917264554031624, -0.12230262590034507, -0.19655728521529914], [-0.10173946348675116, -0.6475660153977272, 0.1260284619729566, -0.11933160462857616, -0.04774310633937567, 
0.09093928358804217, 0.041662676324043114, -0.1264739543938265, 0.09605293126911392, -0.16790474428001648  …  -0.04056684573478108, 0.09351665120940456, 0.15259195558799882, 0.0009949298312580497, 0.09461980828206303, 0.3067004514287283, 0.16129258773733715, -0.18893664085007542, -0.1806865244492513, 0.029319680436405825], [-0.251780954320053, -0.39147463259941456, -0.24359579328578626, 0.30179309757665723, 0.21658893985206484, 0.12304585275893232, 0.28281133086451704, 0.029187615341955325, 0.03616243507191924, 0.029375588909979152  …  -0.08071746662465404, -0.2176101928258658, 0.20944684921170825, 0.043033273425352715, -0.040505542460853576, 0.17935596149079197, -0.08454569418519972, 0.0545941597033932, 0.12471741052450099, -0.24314124407858329], [0.28156471341150974, -0.6708572780452595, -0.1410302363738465, -0.08322589397277698, -0.022772599832907418, -0.04447265789199677, -0.016448068022011157, -0.07490911512503738, 0.2778432295769144, -0.10191899088372378  …  -0.057272155080983836, 0.12817478092201395, 0.04623814480781884, -0.12184190164369117, 0.1987855635987229, -0.14533603246124993, -0.16334072868597016, -0.052369977381939437, 0.014904286931394959, -0.2440882678882144], [0.12108727495744157, -0.714787344982596, 0.01632521838262752, 0.04437570556908449, -0.041199280304144284, 0.052984488452616, 0.03796520200156107, 0.2791785910964288, 0.11530429924056099, 0.12178223160398421  …  -0.07621847481721669, 0.18353870423743013, -0.19066653731436745, -0.09423224997242206, 0.14596847781388494, -0.09747986927777111, 0.16041150122587072, -0.02296513951256738, 0.06786878373578588, 0.15296635978447756]], 0)

Now, for the plain gradient descent, we have to modify the stepsize (to a constant one) and remove the default check whether the cost increases (by setting debug to []). We also only look at the first 20 iterations to keep the number of recorded values small. We call

R3 = gradient_descent(
    M,
    f3,
    grad_f,
    data[1];
    record=[:Iteration => [
        :Iteration,
        RecordCount() => :Count,
        :Cost],
    ],
    stepsize = ConstantStepsize(1.0),
    stopping_criterion=StopAfterIteration(20),
    debug=[],
    return_state=true,
)
# Solver state for `Manopt.jl`s Gradient Descent
After 20 iterations

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ConstantStepsize(1.0, relative)

## Stopping criterion

Max Iteration 20:   reached
This indicates convergence: No

## Record
(Iteration = RecordGroup([RecordIteration(), RecordCount([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]), RecordCost()]),)

For :Cost we already learned how to access the recorded values; the => :Count attaches a name to the preceding action, so that it is accessible via the :Count symbol. We can again access the whole set of records

get_record(R3)
20-element Vector{Tuple{Int64, Int64, Float64}}:
 (1, 1, 0.5808287253777765)
 (2, 2, 0.5395268557323746)
 (3, 3, 0.5333529073733115)
 (4, 4, 0.5324514620174543)
 (5, 5, 0.5323201743667151)
 (6, 6, 0.5323010518577256)
 (7, 7, 0.5322982658416161)
 (8, 8, 0.532297859847447)
 (9, 9, 0.5322978006725337)
 (10, 10, 0.5322977920461375)
 (11, 11, 0.5322977907883957)
 (12, 12, 0.5322977906049865)
 (13, 13, 0.5322977905782369)
 (14, 14, 0.532297790574335)
 (15, 15, 0.5322977905737657)
 (16, 16, 0.5322977905736823)
 (17, 17, 0.5322977905736703)
 (18, 18, 0.5322977905736688)
 (19, 19, 0.5322977905736683)
 (20, 20, 0.5322977905736683)

this is equivalent to calling R3[:Iteration]. Note that, since we introduced :Count, we can also access a single recorded value using

R3[:Iteration, :Count]
20-element Vector{Int64}:
  1
  2
  3
  4
  5
  6
  7
  8
  9
 10
 11
 12
 13
 14
 15
 16
 17
 18
 19
 20

and we see that the cost function is called once per iteration.
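
As a quick cross-check, a hedged sketch: the counter stored inside the cost functor itself should agree with the last recorded value, since no further cost evaluations happened outside the recorded iterations.

# the functor's internal counter after the 20 iterations
f3.count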

If we use this counting cost and run the default gradient descent with Armijo line search, we can infer how many Armijo line search backtracks are performed:

f4 = MyCost(data)
MyCost{Vector{Vector{Float64}}}([[-0.054658825167894595, -0.5592077846510423, -0.04738273828111257, -0.04682080720921302, 0.12279468849667038, 0.07171438895366239, -0.12930045409417057, -0.22102081626380404, -0.31805333254577767, 0.0065859500152017645  …  -0.21999168261518043, 0.19570142227077295, 0.340909965798364, -0.0310802190082894, -0.04674431076254687, -0.006088297671169996, 0.01576037011323387, -0.14523596850249543, 0.14526158060820338, 0.1972125856685378], [-0.08192376929745249, -0.5097715132187676, -0.008339904915541005, 0.07289741328038676, 0.11422036270613797, -0.11546739299835748, 0.2296996932628472, 0.1490467170835958, -0.11124820565850364, -0.11790721606521781  …  -0.16421249630470344, -0.2450575844467715, -0.07570080850379841, -0.07426218324072491, -0.026520181327346338, 0.11555341205250205, -0.0292955762365121, -0.09012096853677576, -0.23470556634911574, -0.026214242996704013], [-0.22951484264859257, -0.6083825348640186, 0.14273766477054015, -0.11947823367023377, 0.05984293499234536, 0.058820835498203126, 0.07577331705863266, 0.1632847202946857, 0.20244385489915745, 0.04389826920203656  …  0.3222365119325929, 0.009728730325524067, -0.12094785371632395, -0.36322323926212824, -0.0689253407939657, 0.23356953371702974, 0.23489531397909744, 0.078303336494718, -0.14272984135578806, 0.07844539956202407], [-0.0012588500237817606, -0.29958740415089763, 0.036738459489123514, 0.20567651907595125, -0.1131046432541904, -0.06032435985370224, 0.3366633723165895, -0.1694687746143405, -0.001987171245125281, 0.04933779858684409  …  -0.2399584473006256, 0.19889267065775063, 0.22468755918787048, 0.1780090580180643, 0.023703860700539356, -0.10212737517121755, 0.03807004103115319, -0.20569120952458983, -0.03257704254233959, 0.06925473452536687], [-0.035534309946938375, -0.06645560787329002, 0.14823972268208874, -0.23913346587232426, 0.038347027875883496, 0.10453333143286662, 0.050933995140290705, -0.12319549375687473, 0.12956684644537844, -0.23540367869989412  …  -0.41471772859912864, -0.1418984610380257, 0.0038321446836859334, 0.23655566917750157, -0.17500681300994742, -0.039189751036839374, -0.08687860620942896, -0.11509948162959047, 0.11378233994840942, 0.38739450723013735], [-0.3122539912469438, -0.3101935557860296, 0.1733113629107006, 0.08968593616209351, -0.1836344261367962, -0.06480023695256802, 0.18165070013886545, 0.19618275767992124, -0.07956460275570058, 0.0325997354656551  …  0.2845492418767769, 0.17406455870721682, -0.053101230371568706, -0.1382082812981627, 0.005830071475508364, 0.16739264037923055, 0.034365814374995335, 0.09107702398753297, -0.1877250428700409, 0.05116494897806923], [-0.04159442361185588, -0.7768029783272633, 0.06303616666722486, 0.08070518925253539, -0.07396265237309446, -0.06008109299719321, 0.07977141629715745, 0.019511027129056415, 0.08629917589924847, -0.11156298867318722  …  0.0792587504128044, -0.016444383900170008, -0.181746064577005, -0.01888129512990984, -0.13523922089388968, 0.11358102175659832, 0.07929049608459493, 0.1689565359083833, 0.07673657951723721, -0.1128480905648813], [-0.21221814304651335, -0.5031823821503253, 0.010326342133992458, -0.12438192100961257, 0.04004758695231872, 0.2280527500843805, -0.2096243232022162, -0.16564828762420294, -0.28325749481138984, 0.17033534605245823  …  -0.13599096505924074, 0.28437770540525625, 0.08424426798544583, -0.1266207606984139, 0.04917635557603396, -0.00012608938533809706, -0.04283220254770056, -0.08771365647566572, 0.14750169103093985, 0.11601120086036351], [0.10683290707435536, -0.17680836277740156, 
0.23767458301899405, 0.12011180867097299, -0.029404774462600154, 0.11522028383799933, -0.3318174480974519, -0.17859266746938374, 0.04352373642537759, 0.2530382802667988  …  0.08879861736692073, -0.004412506987801729, 0.19786810509925895, -0.1397104682727044, 0.09482328498485094, 0.05108149065160893, -0.14578343506951633, 0.3167479772660438, 0.10422673169182732, 0.21573150015891313], [-0.024895624707466164, -0.7473912016432697, -0.1392537238944721, -0.14948896791465557, -0.09765393283580377, 0.04413059403279867, -0.13865379004720355, -0.071032040283992, 0.15604054722246585, -0.10744260463413555  …  -0.14748067081342833, -0.14743635071251024, 0.0643591937981352, 0.16138827697852615, -0.12656652133603935, -0.06463635704869083, 0.14329582429103488, -0.01113113793821713, 0.29295387893749997, 0.06774523575259782]  …  [0.011874845316569967, -0.6910596618389588, 0.21275741439477827, -0.014042545524367437, -0.07883613103495014, -0.0021900966696246776, -0.033836430464220496, 0.2925813113264835, -0.04718187201980008, 0.03949680289730036  …  0.0867736586603294, 0.0404682510051544, -0.24779813848587257, -0.28631514602877145, -0.07211767532456789, -0.15072898498180473, 0.017855923621826746, -0.09795357710255254, -0.14755229203084924, 0.1305005778855436], [0.013457629515450426, -0.3750353654626534, 0.12349883726772073, 0.3521803555005319, 0.2475921439420274, 0.006088649842999206, 0.31203183112392907, -0.036869203979483754, -0.07475746464056504, -0.029297797064479717  …  0.16867368684091563, -0.09450564983271922, -0.0587273302122711, -0.1326667940553803, -0.25530237980444614, 0.37556905374043376, 0.04922612067677609, 0.2605362549983866, -0.21871556587505667, -0.22915883767386164], [0.03295085436260177, -0.971861604433394, 0.034748713521512035, -0.0494065013245799, -0.01767479281403355, 0.0465459739459587, 0.007470494722096038, 0.003227960072276129, 0.0058328596338402365, -0.037591237446692356  …  0.03205152122876297, 0.11331109854742015, 0.03044900529526686, 0.017971704993311105, -0.009329252062960229, -0.02939354719650879, 0.022088835776251863, -0.02546111553658854, -0.0026257225461427582, 0.005702111697172774], [0.06968243992532257, -0.7119502191435176, -0.18136614593117445, -0.1695926215673451, 0.01725015359973796, -0.00694164951158388, -0.34621134287344574, 0.024709256792651912, -0.1632255805999673, -0.2158226433583082  …  -0.14153772108081458, -0.11256850346909901, 0.045109821764180706, -0.1162754336222613, -0.13221711766357983, 0.005365354776191061, 0.012750671705879105, -0.018208207549835407, 0.12458753932455452, -0.31843587960340897], [-0.19830349374441875, -0.6086693423968884, 0.08552341811170468, 0.35781519334042255, 0.15790663648524367, 0.02712571268324985, 0.09855601327331667, -0.05840653973421127, -0.09546429767790429, -0.13414717696055448  …  -0.0430935804718714, 0.2678584478951765, 0.08780994289014614, 0.01613469379498457, 0.0516187906322884, -0.07383067566731401, -0.1481272738354552, -0.010532317187265649, 0.06555344745952187, -0.1506167863762911], [-0.04347524125197773, -0.6327981074196994, -0.221116680035191, 0.0282207467940456, -0.0855024881522933, 0.12821801740178346, 0.1779499563280024, -0.10247384887512365, 0.0396432464100116, -0.0582580338112627  …  0.1253893207083573, 0.09628202269764763, 0.3165295473947355, -0.14915034201394833, -0.1376727867817772, -0.004153096613530293, 0.09277957650773738, 0.05917264554031624, -0.12230262590034507, -0.19655728521529914], [-0.10173946348675116, -0.6475660153977272, 0.1260284619729566, -0.11933160462857616, -0.04774310633937567, 
0.09093928358804217, 0.041662676324043114, -0.1264739543938265, 0.09605293126911392, -0.16790474428001648  …  -0.04056684573478108, 0.09351665120940456, 0.15259195558799882, 0.0009949298312580497, 0.09461980828206303, 0.3067004514287283, 0.16129258773733715, -0.18893664085007542, -0.1806865244492513, 0.029319680436405825], [-0.251780954320053, -0.39147463259941456, -0.24359579328578626, 0.30179309757665723, 0.21658893985206484, 0.12304585275893232, 0.28281133086451704, 0.029187615341955325, 0.03616243507191924, 0.029375588909979152  …  -0.08071746662465404, -0.2176101928258658, 0.20944684921170825, 0.043033273425352715, -0.040505542460853576, 0.17935596149079197, -0.08454569418519972, 0.0545941597033932, 0.12471741052450099, -0.24314124407858329], [0.28156471341150974, -0.6708572780452595, -0.1410302363738465, -0.08322589397277698, -0.022772599832907418, -0.04447265789199677, -0.016448068022011157, -0.07490911512503738, 0.2778432295769144, -0.10191899088372378  …  -0.057272155080983836, 0.12817478092201395, 0.04623814480781884, -0.12184190164369117, 0.1987855635987229, -0.14533603246124993, -0.16334072868597016, -0.052369977381939437, 0.014904286931394959, -0.2440882678882144], [0.12108727495744157, -0.714787344982596, 0.01632521838262752, 0.04437570556908449, -0.041199280304144284, 0.052984488452616, 0.03796520200156107, 0.2791785910964288, 0.11530429924056099, 0.12178223160398421  …  -0.07621847481721669, 0.18353870423743013, -0.19066653731436745, -0.09423224997242206, 0.14596847781388494, -0.09747986927777111, 0.16041150122587072, -0.02296513951256738, 0.06786878373578588, 0.15296635978447756]], 0)

This time we let the solver run with its default stopping criterion and record only the number of cost evaluations in every iteration

R4 = gradient_descent(
    M,
    f4,
    grad_f,
    data[1];
    record=[RecordCount(),],
    return_state=true,
)
# Solver state for `Manopt.jl`s Gradient Descent
After 60 iterations

## Parameters
* retraction method: ExponentialRetraction()

## Stepsize
ArmijoLinesearch() with keyword parameters
  * initial_stepsize    = 1.0
  * retraction_method   = ExponentialRetraction()
  * contraction_factor  = 0.95
  * sufficient_decrease = 0.1

## Stopping criterion

Stop When _one_ of the following are fulfilled:
    Max Iteration 200:  not reached
    |grad f| < 1.0e-8: reached
Overall: reached
This indicates convergence: Yes

## Record
(Iteration = RecordCount([25, 29, 33, 37, 40, 44, 48, 52, 56, 60, 64, 68, 72, 76, 80, 84, 88, 92, 96, 100, 104, 108, 112, 116, 120, 124, 128, 132, 136, 140, 144, 148, 152, 156, 160, 164, 168, 172, 176, 180, 184, 188, 192, 196, 200, 204, 208, 212, 216, 220, 224, 229, 232, 236, 240, 242, 246, 248, 254, 256]),)
get_record(R4)
60-element Vector{Int64}:
  25
  29
  33
  37
  40
  44
  48
  52
  56
  60
  64
  68
  72
   ⋮
 216
 220
 224
 229
 232
 236
 240
 242
 246
 248
 254
 256

We can see that the number of cost function calls varies, depending on how many line search backtrack steps were required to obtain a good stepsize.
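
To see this per iteration, a hedged sketch: differencing the cumulative counts yields the number of cost evaluations, and hence linesearch steps, each iteration required.

# cost evaluations per iteration; the first entry also contains the initial evaluations
diff(vcat(0, get_record(R4)))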