Numerical Optimization

The rate of convergence is <math>\lim_{k \rightarrow \infty} \frac{|x_{k+1}-x^*|}{|x_{k}-x^*|^Q}=L</math><br>
Iterative methods have the following convergence rates:
* If Q=1 and L=1 we have sublinear convergence.
* If Q=1 and <math>L\in(0,1)</math> we have linear convergence.
* If Q=1 and <math>L=0</math> we have superlinear convergence.
* If Q=2 and <math>L < \infty</math> we have quadratic convergence.
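These rates can be observed empirically. As a minimal sketch (not from the original article), the fixed-point iteration <math>x_{k+1} = \cos(x_k)</math> converges linearly (Q=1) to the fixed point of cosine, while Newton's method applied to <math>f(x) = \cos(x) - x</math> converges quadratically (Q=2) to the same root; the error ratios estimate L in each case:

```python
import math

# Known fixed point of cos(x), used as x* for measuring errors.
x_star = 0.7390851332151607

def fixed_point_errors(x0, n):
    """Errors |x_k - x*| for the iteration x_{k+1} = cos(x_k)."""
    errs, x = [], x0
    for _ in range(n):
        x = math.cos(x)
        errs.append(abs(x - x_star))
    return errs

def newton_errors(x0, n):
    """Errors |x_k - x*| for Newton's method on f(x) = cos(x) - x."""
    errs, x = [], x0
    for _ in range(n):
        f = math.cos(x) - x
        df = -math.sin(x) - 1.0
        x -= f / df
        errs.append(abs(x - x_star))
    return errs

fp = fixed_point_errors(1.0, 20)
nt = newton_errors(1.0, 4)

# Linear (Q=1): |e_{k+1}| / |e_k| approaches L = |cos'(x*)| = sin(x*) ≈ 0.674.
print(fp[-1] / fp[-2])
# Quadratic (Q=2): |e_{k+1}| / |e_k|^2 stays bounded by a finite L.
print(nt[2] / nt[1] ** 2)
```

For the linear iteration the ratio settles near sin(x*) ≈ 0.674, squarely in (0, 1); for Newton's method the squared-error ratio stays bounded, matching the Q=2, L < ∞ case above.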
==Line Search Methods==
Basic idea: