Jacobi method applied to a matrix: discuss convergence depending on $a$


Study the convergence of the Jacobi method applied to the matrix

$$ \begin{pmatrix}a^2 & 0 & 0\\0 & 0 & -a^2\\ 0 & -a^2 & 0\end{pmatrix}$$

depending on $a$.

Unfortunately, the matrix is not diagonally dominant, so I don't know which theorem to apply. I was thinking of applying a permutation matrix, but I don't know how to adjust things accordingly afterwards.

Any help is highly appreciated.


There are 2 best solutions below


You can try to compute the matrix $M = -D^{-1}(L+U)$, which describes how the error evolves: $e_{k+1} = M e_k$. Here $D$ is the matrix formed by the diagonal of $A$, $L$ is the strictly lower triangular part of $A$, and $U$ is the strictly upper triangular part. Can you say anything about the eigenvalues of $M$?

If the spectral radius of $M$, i.e. the largest eigenvalue in absolute value (the eigenvalues may be complex in general), is strictly smaller than one, the Jacobi method converges. The asymptotic convergence rate is governed by that spectral radius.
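As a quick numerical sketch of this criterion (using a hypothetical strictly diagonally dominant matrix as a stand-in, since the matrix from the question has zeros on its diagonal and $D^{-1}$ would not exist):

```python
import numpy as np

# Hypothetical example matrix with a nonzero diagonal (not the matrix
# from the question); strictly diagonally dominant, so Jacobi converges.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

D = np.diag(np.diag(A))           # diagonal part of A
L_plus_U = A - D                  # strictly lower + strictly upper parts
M = -np.linalg.inv(D) @ L_plus_U  # Jacobi iteration matrix

# Jacobi converges iff the spectral radius (largest |eigenvalue|) is < 1.
spectral_radius = max(abs(np.linalg.eigvals(M)))
print(spectral_radius < 1)  # True for this diagonally dominant A
```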


Before analysing convergence, note that Jacobi's method cannot even be applied directly to this matrix: two diagonal entries are zero, so $D$ is singular. Swap the second and third rows. The permuted matrix is then diagonal, hence in particular strictly diagonally dominant, unless $a=0$.
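The row swap can be checked numerically; a minimal sketch, assuming some fixed nonzero value of $a$:

```python
import numpy as np

a = 2.0  # any nonzero value; a = 0 makes the matrix singular

A = np.array([[a**2,  0.0,   0.0],
              [0.0,   0.0,  -a**2],
              [0.0,  -a**2,  0.0]])

# Swap rows 2 and 3 via a permutation matrix P (applied to A and to b).
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
PA = P @ A  # diagonal matrix diag(a^2, -a^2, -a^2)

D = np.diag(np.diag(PA))
M = -np.linalg.inv(D) @ (PA - D)  # Jacobi iteration matrix

# PA is diagonal, so M = 0: the spectral radius is 0 and Jacobi
# converges in a single iteration for any nonzero a.
print(np.allclose(M, 0))  # True
```

Since the permuted system is diagonal, each Jacobi step solves it exactly; the iteration is trivial here.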