Eigenvalues gone wild


I added some significant details to this problem, as it was apparently not clear to everyone what I want to know:

This is a question about the convergence of eigenvalues, which came up in studying the spectra of Sturm–Liouville operators. We look at matrices that agree in most of their entries and want to investigate whether this implies convergence of the eigenvalues.

We start with two matrices $$ A_1:=\left[ \begin {array}{cc} 3.5&- 0.5\\ - 0.5& 0.75 \end {array} \right] $$ with eigenvalues $$\lambda_{1,1} := 0.661912511160047 \quad \lambda_{1,2}:=3.58808748883995 $$ and the matrix $$ B_1:=\left[ \begin {array}{ccc} 3.5&- 0.5&-1/4\,\sqrt {2} \\ - 0.5& 0.25&-1/2\,\sqrt {2}\\ - 1/4\,\sqrt {2}&-1/2\,\sqrt {2}&- 0.5\end {array} \right] $$ with eigenvalues $$\mu_{1,0}:=-0.9958877876 \quad \mu_{1,1}:= 0.6554756723 \quad \mu_{1,2}:=3.590412115.$$

We observe that $\lambda_{1,1} \approx \mu_{1,1} $ and $\lambda_{1,2} \approx \mu_{1,2}$.

Now we extend our matrices to larger dimensions, denoted $A_i$ and $B_i$, in the following way. We obtain $A_i$ from $A_1$ as follows:

(i) We use $A_1$ as the bottom-right submatrix $A_i[n-1:n,\,n-1:n]$ of the $n\times n$ matrix $A_i$. The remaining diagonal entries are filled in from bottom to top, with increments growing in steps of two: $A_i(n-2,n-2) = A_i(n-1,n-1) + 5$,

$A_i(n-3,n-3) = A_i(n-2,n-2) + 7$

$A_i(n-4,n-4) = A_i(n-3,n-3) + 9$ and so on.

(ii) Down the first subdiagonal all entries are $-0.5$, and

(iii) down the second subdiagonal all entries are $-0.25$.

All other entries are zero; note that the matrices are symmetric, so the first and second superdiagonals carry the same values $-0.5$ and $-0.25$.
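The construction in (i)–(iii) can be sketched in code. The following is a minimal sketch in Python/NumPy, assuming the reading above (symmetric pentadiagonal matrices with the stated diagonal recursion); the helper name `build_A` is mine:

```python
import numpy as np

def build_A(i):
    """Assemble the (i+1) x (i+1) matrix A_i: A_1 sits in the bottom-right
    2x2 corner, the diagonal grows upward by 5, 7, 9, ..., the first
    sub-/superdiagonal is -0.5 and the second is -0.25."""
    n = i + 1
    diag = [0.75, 3.5]            # diagonal of A_1, read bottom to top
    inc = 5
    while len(diag) < n:
        diag.append(diag[-1] + inc)
        inc += 2                  # increments grow in steps of two
    A = np.diag(np.array(diag[::-1]))   # largest entry ends up top-left
    A += np.diag(-0.50 * np.ones(n - 1), -1) + np.diag(-0.50 * np.ones(n - 1), 1)
    A += np.diag(-0.25 * np.ones(n - 2), -2) + np.diag(-0.25 * np.ones(n - 2), 2)
    return A

print(np.sort(np.linalg.eigvalsh(build_A(4))))
```

For $i=4$ this reproduces the matrix $A_4$ and the eigenvalues quoted for it below.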

For the $B_i$ we use the same extension, but start from the different base matrix

$$ B_i[n-2:n,n-2:n]:=\left[ \begin {array}{ccc} 3.5&- 0.5&-1/4\,\sqrt {2} \\ - 0.5& 0.25&-1/2\,\sqrt {2}\\ - 1/4\,\sqrt {2}&-1/2\,\sqrt {2}&- 0.5\end {array} \right] . $$

Notice that, because we use the same iterative scheme to define $A_i$ and $B_i$, the resulting matrices are very similar. For example, we get

$$A_4:=\left[ \begin {array}{ccccc} 24.5&- 0.5&- 0.25&0&0 \\ - 0.5& 15.5&- 0.5&- 0.25&0\\ - 0.25&- 0.5& 8.5&- 0.5&- 0.25\\ 0&- 0.25&- 0.5& 3.5& - 0.5\\ 0&0&- 0.25&- 0.5& 0.75\end {array} \right] $$

with eigenvalues $$\lambda_{4,1}:=0.647365455124154 \quad \lambda_{4,2}:=3.54058988050425 \quad \lambda_{4,3}:= 8.51760322347614 \quad \lambda_{4,4}:= 15.5136493593423 \quad \lambda_{4,5}:= 24.5307920815531$$

and

$$ B_4:= \left[ \begin {array}{cccccc} 24.5&- 0.5&- 0.25&0&0&0 \\- 0.5& 15.5&- 0.5&- 0.25&0&0\\ - 0.25&- 0.5& 8.5&- 0.5&- 0.25&0\\ 0&- 0.25&- 0.5& 3.5&- 0.5&-1/4\,\sqrt {2}\\ 0&0&- 0.25&- 0.5& 0.25& -1/2\,\sqrt {2}\\ 0&0&0&-1/4\,\sqrt {2}&-1/2\,\sqrt {2}&- 0.5\end {array} \right] $$

This matrix has the eigenvalues $$\mu_{4,0}=-0.9999999836 \quad \mu_{4,1} = 0.6473654185 \quad \mu_{4,2} =3.540589910 \quad \mu_{4,3} =8.517603211 \quad \mu_{4,4} =15.51364936 \quad \mu_{4,5} =24.53079208.$$

Obviously, the eigenvalues of $A_4$ and $B_4$ are extremely close together. However, $B_4$ has an additional eigenvalue $\mu_{4,0}$ with no partner in the spectrum of $A_4$.
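This comparison is easy to reproduce. Below is a self-contained sketch in Python/NumPy, assuming the construction described above (symmetric pentadiagonal matrices, with $B_i$ obtained by overwriting the bottom-right $3\times 3$ block); all helper names are mine:

```python
import numpy as np

SQ2 = np.sqrt(2.0)
B1 = np.array([[ 3.5,    -0.5,   -SQ2/4],
               [-0.5,     0.25,  -SQ2/2],
               [-SQ2/4,  -SQ2/2, -0.5  ]])

def banded(diag_bottom_up):
    """Symmetric pentadiagonal matrix with the given diagonal (listed from
    bottom-right to top-left) and bands -0.5 and -0.25."""
    n = len(diag_bottom_up)
    M = np.diag(np.array(diag_bottom_up[::-1]))
    M += np.diag(-0.50 * np.ones(n - 1), -1) + np.diag(-0.50 * np.ones(n - 1), 1)
    M += np.diag(-0.25 * np.ones(n - 2), -2) + np.diag(-0.25 * np.ones(n - 2), 2)
    return M

def diag_values(start, count):
    """start, start + 5, start + 5 + 7, ...: increments grow in steps of two."""
    vals, inc = [start], 5
    while len(vals) < count:
        vals.append(vals[-1] + inc)
        inc += 2
    return vals

def build_A(i):
    return banded([0.75] + diag_values(3.5, i))

def build_B(i):
    B = banded([-0.5, 0.25] + diag_values(3.5, i))
    B[-3:, -3:] = B1              # overwrite the bottom-right 3x3 block with B_1
    return B

lam = np.sort(np.linalg.eigvalsh(build_A(4)))   # 5 eigenvalues of A_4
mu  = np.sort(np.linalg.eigvalsh(build_B(4)))   # 6 eigenvalues of B_4
print(mu[0])                                    # close to -1
print(np.max(np.abs(lam - mu[1:])))             # largest mismatch is tiny
```

Sorting ascending pairs $\lambda_{4,k}$ with $\mu_{4,k}$, leaving $\mu_{4,0}$ as the unmatched eigenvalue near $-1$.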


So what I want to do is the following:

By the iterative definition of these matrices we get sequences $(A_i)_i$ and $(B_i)_i$ with eigenvalue sequences $(\lambda_{i,1})_{i \ge 1}$, $(\lambda_{i,2})_{i \ge 1}$, $(\lambda_{i,3})_{i \ge 2}$, $(\lambda_{i,4})_{i \ge 3}$, and so on, and likewise $(\mu_{i,0})_{i \ge 1}$, $(\mu_{i,1})_{i \ge 1}$, $(\mu_{i,2})_{i \ge 1}$, $(\mu_{i,3})_{i \ge 2}$, $(\mu_{i,4})_{i \ge 3}$, ...

I want to show that $\mu_{i,0} \rightarrow -1$ and that all the other eigenvalues converge to their partner values, i.e. $\lambda_{i,k} \rightarrow c_k \in \mathbb{R}$ as $i \to \infty$ (that is, as we pass to larger and larger extended matrices) and correspondingly $\mu_{i,k} \rightarrow c_k$.

Numerical simulations actually suggest that this happens (I computed up to $A_{150}$ and $B_{150}$, where I reached pretty good convergence to the values already strongly suggested by $A_4$ and $B_4$), but I am not able to prove it. I will award a 300-point bounty to the person answering this question :-).
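The claimed behaviour can at least be checked numerically for much larger $i$. The following self-contained sketch (same assumed construction as above, helper names mine) tracks $\mu_{i,0}+1$ and the largest gap between partnered eigenvalues as $i$ grows:

```python
import numpy as np

SQ2 = np.sqrt(2.0)
B1 = np.array([[ 3.5,    -0.5,   -SQ2/4],
               [-0.5,     0.25,  -SQ2/2],
               [-SQ2/4,  -SQ2/2, -0.5  ]])

def banded(diag_bottom_up):
    """Symmetric pentadiagonal matrix with bands -0.5 and -0.25."""
    n = len(diag_bottom_up)
    M = np.diag(np.array(diag_bottom_up[::-1]))
    M += np.diag(-0.50 * np.ones(n - 1), -1) + np.diag(-0.50 * np.ones(n - 1), 1)
    M += np.diag(-0.25 * np.ones(n - 2), -2) + np.diag(-0.25 * np.ones(n - 2), 2)
    return M

def diag_values(count):
    """3.5, 8.5, 15.5, ...: increments 5, 7, 9, ..."""
    vals, inc = [3.5], 5
    while len(vals) < count:
        vals.append(vals[-1] + inc)
        inc += 2
    return vals

for i in (4, 10, 50, 150):
    lam = np.sort(np.linalg.eigvalsh(banded([0.75] + diag_values(i))))
    B = banded([-0.5, 0.25] + diag_values(i))
    B[-3:, -3:] = B1
    mu = np.sort(np.linalg.eigvalsh(B))
    # mu[0] should approach -1; the remaining eigenvalues pair off with lam
    print(i, mu[0] + 1.0, np.max(np.abs(lam - mu[1:])))
```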

Hope my problem is clearer now!

There is 1 answer below.

Best answer

This probably doesn't answer the question, but it might give a hint where to look for a solution. Suppose you extend a matrix $A$ by setting $$ \widetilde A :=\begin{pmatrix}a&0\\0 & A\end{pmatrix}, $$ where $a$ is any real number. Of course, the set of eigenvalues of $\widetilde A$ consists of $a$ together with the eigenvalues of $A$. So my guess is that what you observe is not really related to your choice of $a$, but rather to the size of your off-diagonal elements. You have something like $$ \begin{pmatrix}a&\varepsilon\\\delta & A\end{pmatrix}, $$ where $\varepsilon$ is a row vector with comparatively small entries and $\delta$ is a column vector of the same kind. As $\varepsilon,\delta\to 0$ you get perfect agreement, and since the eigenvalues depend continuously on the entries, I think you'll get an answer in that direction. But maybe I got the whole thing wrong; then I can of course delete this answer.
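A small numerical illustration of this continuity argument (the matrix $A$, the entry $a$, and the coupling vector below are arbitrary examples of mine, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 5.0])      # any symmetric matrix works here
a = -1.0                          # the adjoined diagonal entry
v = rng.standard_normal(3)        # direction of the coupling

# Spectrum for zero coupling: {a} together with the eigenvalues of A.
target = np.sort(np.append(np.linalg.eigvalsh(A), a))

for eps in (1.0, 0.1, 0.01, 0.001):
    T = np.block([[np.array([[a]]), eps * v[None, :]],
                  [eps * v[:, None], A]])
    ev = np.sort(np.linalg.eigvalsh(T))
    # By Weyl's inequality the mismatch is at most eps * ||v||.
    print(eps, np.max(np.abs(ev - target)))
```

The printed mismatch shrinks as `eps` does, exactly as the continuity of eigenvalues in the matrix entries predicts.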