It is useful to express convergence results as complexity bounds, i.e., given a precision $\varepsilon > 0$, provide a bound on the number of iterations needed to reach that precision, in the worst-case scenario.
For Newton’s method, show that the number of iterations $k$ that guarantees $\|x_k - x^\star\| \leq \varepsilon$ is such that
\[
k \;\geq\; c_1 + c_2 \log\log\left(\frac{1}{\varepsilon}\right),
\]
where $c_1, c_2$ are constants to be determined.
(Hint): use the fact that the condition that $x_0$ must be sufficiently close to $x^\star$ can be formulated as $M\|x_0 - x^\star\| \leq \tfrac{1}{2}$, for a suitable problem-dependent constant $M$.
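A minimal sketch of where the $\log\log$ dependence comes from, assuming the standard local quadratic bound $M\|x_{k+1} - x^\star\| \leq \big(M\|x_k - x^\star\|\big)^2$ (the constant $M$ and this exact form are assumptions, not necessarily the chapter’s notation): iterating this bound and using the hint gives
\[
M\|x_k - x^\star\| \;\leq\; \big(M\|x_0 - x^\star\|\big)^{2^k} \;\leq\; \left(\frac{1}{2}\right)^{2^k},
\]
so $\|x_k - x^\star\| \leq \varepsilon$ as soon as $2^k \geq \log_2\frac{1}{M\varepsilon}$, i.e. $k \geq \log_2\log_2\frac{1}{M\varepsilon}$, which is of the announced form.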
We conclude this chapter with a few important takeaways.
\begin{itemize}
\item Convergence results can be stated in terms of objective values or in terms of convergence of the iterates (sometimes both).
\item Convergence rates correspond to worst-case scenarios: they do not describe the exact rate of an algorithm on a given problem.
\item An upper bound of the form $f(x_k) - f^\star \leq \rho^k \big(f(x_0) - f^\star\big)$ with $\rho \in (0,1)$ means that gradient descent, under the right assumptions, converges at least with a linear rate (see the numerical sketch after this list).
\item A bound of the form $\|x_{k+1} - x^\star\| \leq M \|x_k - x^\star\|^2$ means that Newton’s method, under the right assumptions, converges at least with a quadratic rate.
\item Statements on the number of iterations, such as $k \geq c_1 + c_2 \log\log(1/\varepsilon)$ above, are “at most” statements: at most that many iterations are needed to reach the target precision, and in practice fewer may suffice.
\end{itemize}
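To make these “at least” statements concrete, here is a minimal numerical sketch (not from the chapter) comparing the two methods on the smooth, locally strongly convex one-dimensional function $f(x) = e^x - x$, minimized at $x^\star = 0$; the function, step size and starting point are arbitrary illustrative choices.
\begin{verbatim}
import math

# Illustrative example: f(x) = exp(x) - x, with minimizer x* = 0.
# f'(x) = exp(x) - 1 and f''(x) = exp(x).
grad = lambda x: math.exp(x) - 1.0
hess = lambda x: math.exp(x)

x_gd, x_nt = 1.0, 1.0       # common (arbitrary) starting point
step = 1.0 / math.e         # roughly 1/L on [-1, 1], an illustrative choice

print(f"{'iter':>4} {'|x_gd|':>12} {'|x_newton|':>12}")
for k in range(1, 11):
    x_gd -= step * grad(x_gd)        # gradient descent: error shrinks by a constant factor
    x_nt -= grad(x_nt) / hess(x_nt)  # Newton: error is roughly squared each step
    print(f"{k:>4} {abs(x_gd):12.3e} {abs(x_nt):12.3e}")
\end{verbatim}
The distance to $x^\star$ decreases by a roughly constant factor per iteration for gradient descent, while the number of correct digits roughly doubles per iteration for Newton’s method, until machine precision is reached.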