Study the convergence of the Jacobi method applied to the matrix
$$ \begin{pmatrix}a^2 & 0 & 0\\0 & 0 & -a^2\\ 0 & -a^2 & 0\end{pmatrix}$$
depending on $a$.
Unfortunately, the matrix is not diagonally dominant, so I don't know which convergence theorem applies. I was thinking of applying a permutation matrix, but then I don't know how to adjust the rest of the argument accordingly.
Any help is highly appreciated.
You can try to compute the iteration matrix $M = -D^{-1}(L+U)$, which describes how the error evolves: it holds that $e_{k+1} = M e_k$. Here $D$ denotes the diagonal part of $A$, and $L$ and $U$ are the strictly lower and strictly upper triangular parts of $A$, so that $A = D + L + U$. (Note that for the matrix above two diagonal entries are zero, so $D$ is singular and the splitting is not directly applicable; permuting the rows first, as you suggest, fixes this.) Can you say anything about the eigenvalues of $M$?
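As a numerical sanity check of this construction, here is a sketch with a hypothetical diagonally dominant matrix (chosen for illustration only, since $D$ is singular for the matrix in the question):

```python
import numpy as np

# Hypothetical diagonally dominant example matrix (not the one from the
# question, whose diagonal contains zeros).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

D = np.diag(np.diag(A))           # diagonal part of A
LU = A - D                        # L + U (strictly lower + upper parts)
M = -np.linalg.solve(D, LU)       # iteration matrix M = -D^{-1}(L+U)
c = np.linalg.solve(D, b)         # constant term D^{-1} b

# Jacobi iteration x_{k+1} = M x_k + c; the error satisfies e_{k+1} = M e_k.
x = np.zeros(3)
for _ in range(50):
    x = M @ x + c

print(np.allclose(x, np.linalg.solve(A, b)))  # True
```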
The Jacobi method converges (for every starting vector) if and only if the spectral radius of $M$ is strictly smaller than one, i.e. every eigenvalue of $M$ has modulus less than one. The asymptotic convergence rate is governed by the eigenvalue of largest modulus.
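To illustrate the permutation idea on the matrix from the question: swapping rows 2 and 3 moves the $-a^2$ entries onto the diagonal, and the resulting system is in fact diagonal, so the Jacobi iteration matrix vanishes and the method converges in one step. A small sketch (the value of `a` is an arbitrary nonzero choice):

```python
import numpy as np

a = 2.0  # any nonzero value; a is a free parameter

# The matrix from the question: two zero diagonal entries, so the
# Jacobi splitting is undefined for it as it stands.
A = np.array([[a**2, 0.0,   0.0],
              [0.0,  0.0,  -a**2],
              [0.0, -a**2,  0.0]])

# Permutation swapping equations 2 and 3.
P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]], dtype=float)
PA = P @ A  # diagonal matrix diag(a^2, -a^2, -a^2)

D = np.diag(np.diag(PA))          # diagonal part (now invertible)
LU = PA - D                       # strictly lower + upper parts (zero here)
M = -np.linalg.solve(D, LU)       # Jacobi iteration matrix M = -D^{-1}(L+U)

rho = max(abs(np.linalg.eigvals(M)))
print(rho)  # 0.0: PA is diagonal, so Jacobi converges in a single step
```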