
Can the rate or order of convergence of Gradient Descent change when we select a different stopping condition?

For example, if the algorithm needs $O(1/\varepsilon)$ iterations to reach a point where $\|\nabla f(x)\| \leq \varepsilon$, can this Big-O complexity change when we instead stop once $\|x_{k+1} - x_{k}\| \leq \varepsilon$ (with variable gradient-descent step sizes), or once the difference between the successive function values $f(x_{k})$ and $f(x_{k+1})$ is at most $\varepsilon$ (where both $f$ and $\nabla f$ are Lipschitz continuous)?
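
For concreteness, here is a minimal sketch of the comparison I have in mind (a simple quadratic objective with a constant step size; the function and parameter names are my own illustration, nothing standard):

```python
import numpy as np

def gradient_descent(grad, x0, step, eps, criterion, max_iter=100_000):
    # Run gradient descent until the chosen stopping criterion fires
    # and return the number of iterations performed.
    #   criterion == "grad": stop when ||grad f(x_k)|| <= eps
    #   criterion == "step": stop when ||x_{k+1} - x_k|| <= eps
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if criterion == "grad" and np.linalg.norm(g) <= eps:
            return k
        x_new = x - step * g
        if criterion == "step" and np.linalg.norm(x_new - x) <= eps:
            return k + 1
        x = x_new
    return max_iter

# Ill-conditioned quadratic f(x) = 0.5 x^T A x with gradient A x;
# both f and its gradient are Lipschitz on any bounded set.
A = np.diag([100.0, 1.0])
grad = lambda x: A @ x

for crit in ("grad", "step"):
    n = gradient_descent(grad, x0=[1.0, 1.0], step=1.0 / 100.0,
                         eps=1e-6, criterion=crit)
    print(crit, n)
```

Note that with a constant step size $\alpha$ we have $\|x_{k+1} - x_k\| = \alpha \|\nabla f(x_k)\|$, so the step-norm criterion is just the gradient-norm criterion with $\varepsilon$ rescaled by $\alpha$, and the Big-O count cannot change. The question is whether this still holds once the step sizes vary.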

EDIT: I had not thought this through - of course we could always construct step-size schedules that make the complexity worse. But if we use exact line search at each step, for example, or, say, some inexact search based on the Armijo condition (see the sketch below), could the choice of stopping condition change the convergence rate?
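
To be concrete about what I mean by an inexact Armijo search (again only a sketch; the parameter values are typical defaults, not anything canonical):

```python
import numpy as np

def armijo_step(f, x, g, alpha0=1.0, beta=0.5, sigma=1e-4):
    # Backtracking line search: shrink alpha until the Armijo
    # (sufficient-decrease) condition holds:
    #   f(x - alpha g) <= f(x) - sigma * alpha * ||g||^2
    # alpha0, beta, sigma are common but arbitrary choices.
    alpha = alpha0
    fx = f(x)
    while f(x - alpha * g) > fx - sigma * alpha * (g @ g):
        alpha *= beta
    return alpha

def gd_armijo(f, grad_f, x0, eps, max_iter=100_000):
    # Gradient descent with Armijo backtracking; the stopping test
    # below is the gradient-norm one and could be swapped for
    # ||x_{k+1} - x_k|| <= eps or f(x_k) - f(x_{k+1}) <= eps.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:
            return x, k
        x = x - armijo_step(f, x, g) * g
    return x, max_iter
```

If I recall correctly, for an $L$-smooth $f$ the step sizes produced by this backtracking are bounded below by a constant depending on $\alpha_0$, $\beta$, $\sigma$ and $L$, so $\|x_{k+1} - x_k\| = \alpha_k \|\nabla f(x_k)\|$ stays within constant factors of $\|\nabla f(x_k)\|$; if that is right, the order should again be unaffected, but I would like to see this confirmed or see a counterexample.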

