For example, there is the Jacobi method, which finds a solution of $Ax=b$ iteratively. Using the residual vector $r^{(k)} := b - Ax^{(k)}$ we get:
$x^{(k+1)}=-D^{-1}(L+U)x^{(k)}+D^{-1}b=D^{-1}\left((D-A)x^{(k)}+b\right)=x^{(k)}+D^{-1}\left(b-Ax^{(k)}\right)=x^{(k)}+D^{-1}r^{(k)}$,
where $A=L+D+U$ is the splitting of $A$ into its strictly lower triangular, diagonal, and strictly upper triangular parts.
Let $s^{(k)} := D^{-1}r^{(k)}$.
So one step of the iteration is:
$x^{(k+1)}=x^{(k)}+s^{(k)}$ and
$r^{(k+1)}=r^{(k)}-As^{(k)}$
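The two update steps above can be sketched in NumPy as follows (the function name `jacobi`, the tolerance/iteration parameters, and the small test system are my own choices for illustration):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=1000):
    """Jacobi iteration in residual form:
    s = D^{-1} r,  x <- x + s,  r <- r - A s."""
    D = np.diag(A)              # diagonal entries of A
    x = x0.astype(float).copy()
    r = b - A @ x               # initial residual r^(0) = b - A x^(0)
    for _ in range(max_iter):
        s = r / D               # s^(k)   = D^{-1} r^(k)
        x = x + s               # x^(k+1) = x^(k) + s^(k)
        r = r - A @ s           # r^(k+1) = r^(k) - A s^(k)
        if np.linalg.norm(r) < tol:
            break
    return x

# Example: a strictly diagonally dominant matrix, so Jacobi converges
A = np.array([[4.0, 1.0],
              [2.0, 5.0]])
b = np.array([1.0, 2.0])
x = jacobi(A, b, np.zeros(2))
```

Note that updating $r^{(k+1)}=r^{(k)}-As^{(k)}$ reuses the one matrix-vector product per step instead of recomputing $b-Ax^{(k+1)}$ from scratch.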
My question is: what is the point of using this method? Does it converge faster? Does it give a more precise approximation, i.e. a smaller error?
Faster than what other methods? It's obviously an iterative algorithm, so it can be faster than direct methods, which typically require $\mathcal{O}(n^{3})$ operations, whereas each Jacobi step costs only one matrix-vector product, i.e. $\mathcal{O}(n^{2})$ for a dense matrix and far less for a sparse one.