I have a presentation in less than a day and my brain seems to be stuck.
I'm trying to work out the rough convergence rate of my algorithm on the Rosenbrock function, and the data don't make much sense to me.
I measure the number of steps required to reach each target accuracy $\varepsilon$, rerunning the algorithm from the last termination point while gradually decreasing $\varepsilon$. I also record all the step coordinates from the last run (the one with the smallest target accuracy); these are discarded if the maximum number of steps is reached.
Plotting the number of steps against $1/\varepsilon$, where $\varepsilon$ is the target accuracy, produces a very fast-growing curve; setting the x-scale to log gives a straight line (up to the point where the maximum number of steps is reached, where it flattens out). So far so good - this looks like a linear rate.
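For concreteness, here is a simplified sketch of the set-up (not my actual code: plain gradient descent with a fixed step size stands in for my algorithm, the starting point, step size and step budget are placeholders, and I take "accuracy" to mean the distance to the known minimizer $[1,1]$):

```python
import numpy as np
import matplotlib.pyplot as plt

X_STAR = np.array([1.0, 1.0])  # known minimizer of the Rosenbrock function

def rosenbrock_grad(x):
    # Gradient of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def run_until(x0, eps, lr=1e-3, max_steps=500_000):
    """Iterate until ||x - x*|| < eps; return (steps used, final point, trajectory)."""
    x = np.asarray(x0, dtype=float)
    traj = [x.copy()]
    for k in range(max_steps):
        if np.linalg.norm(x - X_STAR) < eps:
            return k, x, traj
        x = x - lr * rosenbrock_grad(x)
        traj.append(x.copy())
    return max_steps, x, traj  # step budget exhausted before reaching eps

# Gradually decreasing target accuracies; each run restarts from the previous
# termination point, and the trajectory of the last (smallest-eps) run is kept.
# (As described above, the trajectory would be discarded if max_steps is hit.)
eps_values = [10.0 ** (-d) for d in range(1, 9)]
x, total, steps_needed, traj = np.array([-1.2, 1.0]), 0, [], None
for eps in eps_values:
    k, x, traj = run_until(x, eps)
    total += k
    steps_needed.append(total)  # cumulative step count across the restarts

# Steps vs 1/eps: on a log x-axis this should come out as roughly the
# straight line described above (flattening wherever max_steps is hit).
plt.plot([1.0 / e for e in eps_values], steps_needed, "o-")
plt.xscale("log")
plt.xlabel(r"$1/\varepsilon$")
plt.ylabel("steps to reach target accuracy")
plt.show()
```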
Now, to confirm, I use the step coordinates collected during the last run: I plot the step number against the corresponding distances from the actual minimum (known to be $[1,1]$). Again a rapidly increasing function - setting the x-scale to log (and reversing it) again gives a straight line.
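Continuing the sketch above (reusing `traj` and `X_STAR`), that second plot would be produced roughly like this:

```python
# Distance of every iterate of the last run from the known minimum [1, 1].
dists = [np.linalg.norm(p - X_STAR) for p in traj]

# Step number against distance, with a reversed log x-axis as described above.
plt.plot(dists, range(len(dists)))
plt.xscale("log")
plt.gca().invert_xaxis()
plt.xlabel("distance to the minimum (log scale, reversed)")
plt.ylabel("step number")
plt.show()
```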
That is, each extra "digit of accuracy" seems to cost a roughly constant number of additional iterations.
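If I'm reading the usual textbook definition right (see the links at the end), that is exactly what a linear (Q-linear) rate predicts:

$$\|x_{k+1} - x^*\| \le \rho \,\|x_k - x^*\|, \qquad 0 < \rho < 1,$$

so the error shrinks by at least a factor $\rho^m$ over $m$ steps, and one extra decimal digit of accuracy costs roughly $m \approx \ln 10 / \ln(1/\rho)$ iterations, which is a constant.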
But then I plot the ratio of successive distances from the minimum and get this:
Which seems to show a sublinear rate - the ratios are nearly all 1. Sorry, I'm sure this is all very confused by now.
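For reference, in terms of the sketch above (reusing `dists`), the ratios in that last plot would be computed roughly like this:

```python
# Ratio of successive distances to the minimum: dists[k+1] / dists[k].
ratios = [dists[k + 1] / dists[k] for k in range(len(dists) - 1)]

plt.plot(ratios, ".")
plt.xlabel("step number")
plt.ylabel("ratio of successive distances to the minimum")
plt.show()
```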
https://link.springer.com/book/10.1007/978-0-387-40065-5
https://www.cs.ubc.ca/~schmidtm/Courses/540-W18/L5.pdf
https://www.princeton.edu/~aaa/Public/Teaching/ORF363_COS323/F23/ORF363_COS323_F23_Lec8.pdf