Numerical Optimization

==Convergence Rates==
[https://en.wikipedia.org/wiki/Rate_of_convergence Wikipedia]<br>
A sequence of iterates <math>x_k</math> converges to <math>x^*</math> with order <math>q</math> and rate <math>L</math> if <math>\lim_{k \rightarrow \infty} \frac{|x_{k+1}-x^*|}{|x_{k}-x^*|^q}=L</math>.<br>
Iterative methods are classified by the following convergence rates (an empirical check follows the list below):
* If <math>q=1</math> and <math>L=1</math>, convergence is sublinear.
* If <math>q=1</math> and <math>L\in(0,1)</math>, convergence is linear.
* If <math>q=1</math> and <math>L=0</math>, convergence is superlinear.
* If <math>q=2</math> and <math>L < \infty</math>, convergence is quadratic.
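As a rough illustration (not from the original article), the sketch below estimates <math>L</math> empirically for two iterations that compute <math>\sqrt{2}</math> as the root of <math>f(x)=x^2-2</math>: a fixed-point iteration (linear, <math>q=1</math>) and Newton's method (quadratic, <math>q=2</math>). The test function and step rules are illustrative choices.
<syntaxhighlight lang="python">
import math

x_star = math.sqrt(2.0)  # known solution, used only to measure the error

def ratios(iterates, q):
    """Return |x_{k+1} - x*| / |x_k - x*|^q for consecutive iterates."""
    errs = [abs(x - x_star) for x in iterates]
    return [e_next / e ** q for e, e_next in zip(errs, errs[1:]) if e > 0]

# Linearly convergent fixed-point iteration: x <- x - 0.25*(x^2 - 2)
x, fixed_point_iters = 1.0, [1.0]
for _ in range(20):
    x = x - 0.25 * (x * x - 2.0)
    fixed_point_iters.append(x)

# Quadratically convergent Newton iteration: x <- x - (x^2 - 2)/(2x)
x, newton_iters = 1.0, [1.0]
for _ in range(5):
    x = x - (x * x - 2.0) / (2.0 * x)
    newton_iters.append(x)

# For the correct order q, the ratios settle near a constant L.
print("fixed point (q=1):", [round(r, 4) for r in ratios(fixed_point_iters, 1)[-3:]])
print("Newton      (q=2):", [round(r, 4) for r in ratios(newton_iters, 2)[:3]])
</syntaxhighlight>
For the fixed-point iteration the printed ratios with <math>q=1</math> settle near <math>L \approx 0.29</math> (linear convergence), while for Newton's method the ratios with <math>q=2</math> remain bounded, approaching roughly <math>0.35</math> (quadratic convergence).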
==Line Search Methods==
Basic idea: at each iteration, choose a descent direction <math>p_k</math> and a step length <math>\alpha_k</math>, then update <math>x_{k+1} = x_k + \alpha_k p_k</math>; a sketch follows below.
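As a concrete sketch (not from the original article), the code below implements a backtracking (Armijo) line search and uses it with steepest-descent directions on a simple quadratic; the test function and the parameters <code>alpha0</code>, <code>rho</code>, and <code>c</code> are illustrative choices.
<syntaxhighlight lang="python">
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until f(x + alpha*p) <= f(x) + c*alpha*grad(x)^T p."""
    alpha, fx, slope = alpha0, f(x), grad(x) @ p  # slope < 0 for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Example: steepest descent with backtracking on f(x) = 0.5 * x^T A x
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([2.0, -1.5])
for k in range(20):
    p = -grad(x)                               # descent direction p_k
    alpha = backtracking_line_search(f, grad, x, p)
    x = x + alpha * p                          # update x_{k+1} = x_k + alpha_k * p_k
print("approximate minimizer:", x)             # approaches the minimizer at the origin
</syntaxhighlight>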