On the incomparability of the Gauss-Seidel and Jacobi iteration schemes.


Given the system

$x_1 + x_2 = 2$

$-x_1 + x_2=0$

$x_1 + 2x_2 - 3x_3=0$

the Jacobi iteration converges and the Gauss-Seidel iteration diverges. Is there a way to derive these two facts using the fact that the fixed-point iteration for the system $x = Cx + b$ of $n$ linear equations in $n$ unknowns converges if the iteration matrix $C$ has a spectral radius of less than $1$? Thanks for your help.

Are you sure the Jacobi method converges (for every starting vector)? Examine the error propagation matrix, keeping in mind that $a_{33} = -3$, so the third row of $D^{-1}A$ is scaled by $-1/3$: $$C= I - D^{-1} A = \begin{pmatrix} 1 &0 & 0\\ 0&1&0\\0&0&1\end{pmatrix} - \begin{pmatrix} 1& 1 & 0\\ -1 &1&0\\ -1/3 & -2/3 & 1 \end{pmatrix} = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0\\ 1/3 & 2/3 &0 \end{pmatrix} $$ with eigenvalues $0, \pm i$. Thus, the spectral radius $\rho(C)$ is $1$ and the method is not guaranteed to converge for every starting vector.
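The eigenvalue claim can be checked without any linear-algebra package: for this particular $C$ one can verify the relation $C^3 = -C$, which forces the characteristic polynomial to be $\lambda(\lambda^2+1)$, i.e. eigenvalues $0, \pm i$. A pure-Python sketch of that check (the helper `matmul` is just an illustration, not part of the original answer):

```python
# Build the Jacobi iteration matrix C = I - D^{-1} A for this system
# and verify the Cayley-Hamilton relation C^3 = -C. That relation means
# the characteristic polynomial is lambda^3 + lambda = lambda(lambda^2 + 1),
# so the eigenvalues are 0, i, -i and the spectral radius rho(C) = 1.

A = [[1, 1, 0],
     [-1, 1, 0],
     [1, 2, -3]]
n = 3

# C = I - D^{-1} A, where D is the diagonal part of A
C = [[(1.0 if i == j else 0.0) - A[i][j] / A[i][i] for j in range(n)]
     for i in range(n)]

def matmul(X, Y):
    """Plain 3x3 matrix product (no external libraries)."""
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

C3 = matmul(matmul(C, C), C)

# C^3 + C should vanish entry-wise
residual = max(abs(C3[i][j] + C[i][j]) for i in range(n) for j in range(n))
print(residual)  # -> 0.0
```

Note that row 3 of $C$ comes out as $(1/3, 2/3, 0)$ precisely because $a_{33} = -3$ flips the signs in $D^{-1}A$.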

I also ran a few experiments in Matlab and did not observe convergence. Here is the code:

clear;
clc;
close all;

Rand_Range = 10;

b = Rand_Range * rand(3, 1);
A = [1, 1, 0; -1, 1, 0; 1, 2, -3];

disp(A\b); % True solution

k = 0;
k_max = 10000;
omega = 1; % For damped Jacobi

D_inv = diag(1./diag(A) );
D_inv_scaled = omega * D_inv;
Update_Mat = eye(3) - D_inv_scaled * A;

x_end = Rand_Range * rand(3, 1);
while k < k_max
  x_end = Update_Mat * x_end + D_inv_scaled * b;
  k=k+1;
end
disp(x_end);
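The same spectral-radius test can be applied to the Gauss-Seidel half of the question, whose iteration matrix is $M = -(D+L)^{-1}U$ with $A = L + D + U$ split into strictly lower, diagonal, and strictly upper parts. A pure-Python sketch (forward substitution instead of an explicit inverse; this computation is an addition to the answer above): here $M$ satisfies $M^2 = -M$, so every eigenvalue solves $t^2 = -t$, giving eigenvalues in $\{0, -1\}$ and $\rho(M) = 1$ as well.

```python
# Gauss-Seidel iteration matrix M = -(D + L)^{-1} U for the same system,
# built column by column via forward substitution on (D + L) x = u.

A = [[1, 1, 0],
     [-1, 1, 0],
     [1, 2, -3]]
n = 3

# Strictly upper-triangular part U of A
U = [[A[i][j] if j > i else 0.0 for j in range(n)] for i in range(n)]

M = [[0.0] * n for _ in range(n)]
for j in range(n):
    x = [0.0] * n
    for i in range(n):
        # Row i of (D + L) x = u_j: A[i][k] for k < i is the L part,
        # A[i][i] is the diagonal.
        s = sum(A[i][k] * x[k] for k in range(i))
        x[i] = (U[i][j] - s) / A[i][i]
    for i in range(n):
        M[i][j] = -x[i]

# Verify M^2 = -M: eigenvalues then satisfy t^2 = -t, i.e. t in {0, -1},
# so the spectral radius is again 1 and Gauss-Seidel is also not
# guaranteed to converge for every starting vector.
M2 = [[sum(M[i][k] * M[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
residual = max(abs(M2[i][j] + M[i][j]) for i in range(n) for j in range(n))
print(residual)  # -> 0.0
```

The eigenvalue $-1$ explains the observed behaviour: the Gauss-Seidel error component along that eigenvector oscillates in sign without shrinking, so neither scheme converges for a generic starting vector.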