I am a programmer and I am not good at math, so I came here to ask for advice.
I need to solve a linear system $Ax = b$. I am able to use Gauss-Seidel iteration and Jacobi iteration.
However, for both of these iterative methods I need to determine in advance whether they will converge; otherwise they waste time.
I found convergence criteria on the wiki, but they are quite complicated: (1) $A$ is symmetric positive-definite; (2) $A$ is strictly or irreducibly diagonally dominant; (3) compute the spectral radius of the iteration matrix.
I want a quick and easy way to detect that the iteration is diverging. So I came up with this:
if $err^{(i)} > err^{(i-1)}$, then judge the iteration as diverging, where $err^{(i)} = A x^{(i)} - b$ and $i$ is the iteration number.
My idea is: if the error of the current iteration is larger than the error of the previous iteration, the method is judged to be diverging.
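In code, my idea would look something like this (a Python/NumPy sketch; the function name, the example matrix, and the iteration cap are just illustrative choices of mine, and I compare norms of the residual since vectors can't be compared directly):

```python
import numpy as np

def jacobi_with_naive_test(A, b, x0, max_iter=25):
    """Jacobi iteration with the proposed stop rule:
    quit as soon as the residual grows."""
    D = np.diag(np.diag(A))                 # diagonal part of A
    R = A - D                               # off-diagonal part
    x = x0.astype(float)
    prev_err = np.linalg.norm(A @ x - b)    # size of the residual vector
    for _ in range(max_iter):
        x = np.linalg.solve(D, b - R @ x)   # one Jacobi step
        err = np.linalg.norm(A @ x - b)
        if err > prev_err:                  # the proposed divergence test
            return None                     # judged as diverging
        prev_err = err
    return x

# Example: a strictly diagonally dominant system (Jacobi converges here).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = jacobi_with_naive_test(A, b, np.zeros(2))
```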
I'm not very good at math, so I'd like to ask everyone whether this method is feasible. Thanks!
No.
This method wouldn't work.
$err$ here is a vector. Perhaps you would like to measure its norm instead.
Even if we measure its norm, your test might rule out a converging sequence. For example, consider the sequence $a_n = \frac1{2^n} + \frac{(-1)^n}{n}$. In absolute value, the first term is $\frac12$, followed by $\frac34$. Your method would have ruled it out, but the sequence eventually converges to $0$.
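You can check this numerically with a few lines of Python (indexing starts at $n = 1$):

```python
# a_n = 1/2**n + (-1)**n / n, taken in absolute value
terms = [abs(1 / 2**n + (-1)**n / n) for n in range(1, 13)]
# |a_1| = 0.5 but |a_2| = 0.75: the "error" grew after one step,
# so the proposed rule reports divergence -- yet a_n -> 0.
```

The increase from $0.5$ to $0.75$ would trip the test immediately, while the later terms keep shrinking toward $0$.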