Finding the eigenvalues of a 9x9 matrix


Is there any way I can find the eigenvalues of this matrix? I've used SymPy and tried the M.eigenvals() function, but it always runs out of time. My main goal is to diagonalise it, so if anyone can help that would be great. Thanks!

$$ \begin{bmatrix} -e^{-ik_{1}} & e^{-ik_{1}} & e^{-ik_{1}} & e^{-ik_{1}} & 0 & 2\sqrt{3}ie^{ik_{1}} & 0 & 0 & 0 \\ e^{ik_{1}} & -e^{ik_{1}} & e^{ik_{1}} & e^{ik_{1}} & 0 & 0 & 2\sqrt{3}ie^{-ik_{1}} & 0 & 0 \\ e^{-ik_{2}} & e^{-ik_{2}} & -e^{-ik_{2}} & e^{-ik_{2}} & 0 & 0 & 0 & 2\sqrt{3}ie^{-ik_{2}} & 0 \\ e^{ik_{2}} & e^{ik_{2}} & e^{ik_{2}} & -e^{ik_{2}} & 0 & 0 & 0 & 0 & 2\sqrt{3}ie^{ik_{2}} \\ 0 & 0 & 0 & 0 & 4 & 0 & 0 & 0 & 0\\ 2\sqrt{3}ie^{i(k_{2}+k_{1})} & 0 & 0 & 0 & 0 & -e^{-i(k_{2}+k_{1})} & e^{-i(k_{2}+k_{1})} & e^{-i(k_{2}+k_{1})} & e^{-i(k_{2}+k_{1})}\\ 0 & 2\sqrt{3}ie^{i(k_{1}-k_{2})} & 0 & 0 & 0 & e^{i(k_{1}-k_{2})} & -e^{i(k_{1}-k_{2})} & e^{i(k_{1}-k_{2})} & e^{i(k_{1}-k_{2})}\\ 0 & 0 & 2\sqrt{3}ie^{i(k_{2}-k_{1})} & 0 & 0 & e^{i(k_{2}-k_{1})} & e^{i(k_{2}-k_{1})} & -e^{i(k_{2}-k_{1})} & e^{i(k_{2}+k_{1})}\\ 0 & 0 & 0 & 2\sqrt{3}ie^{i(k_{1}+k_{2})} & 0 & e^{i(k_{1}+k_{2})} & e^{i(k_{1}+k_{2})} & e^{i(k_{1}+k_{2})} & -e^{i(k_{1}+k_{2})}\\ \end{bmatrix}$$

Edit: For context, I'm trying to do a quantum walk on a D2Q9 lattice, which I have done successfully. Quantum walks use coin operators, and I want to transform my coin into Fourier space. The matrix above is the coin transformed into Fourier space; however, I need its eigenvalues and eigenvectors to get a usable operator. An approximation may not scale well when used as a coin operator in a quantum walk, which is why I'm looking for an exact solution rather than an approximation.

Accepted answer:
Let $z = e^{i k_1}$ and $w = e^{i k_2}$. Finding the eigenvalues of this $9 \times 9$ matrix amounts to finding the roots of a degree-$9$ polynomial whose coefficients involve various integer powers of $z$ and $w$. From there, finding the corresponding eigenvectors requires solving linear systems, which can create even more of a mess.

Rootfinding for high-degree polynomials is easy if you're OK with a numerical solution, but getting exact symbolic answers is generally only possible with some case-specific insight into the structure of this particular polynomial. It can't hurt to try just plugging into computer algebra systems, but you should be open to the possibility that a) there is no "nice" solution at all, or b) you need to find some clever simplification yourself based on details of your specific problem, and only then give the computer an easier problem to solve.

I can confirm that Mathematica does not give useful output if I type your matrix in directly and feed it to the Eigensystem command. It does give a symbolic result, but only in terms of the roots of several degree-$9$ polynomials, and Mathematica does not find any good closed-form expression for those roots.

I can also confirm that if I plug in specific values for $k_1$ and $k_2$ and use numerical methods then Mathematica gives answers no problem. I expect standard Python tools could do the same.
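To make this concrete, here is a minimal NumPy sketch that builds the matrix entry by entry and diagonalises it numerically. The momenta $k_1 = 0.7$, $k_2 = 0.3$ are arbitrary sample values of my own choosing, not anything from the question:

```python
import numpy as np

def coin_matrix(k1, k2):
    """The 9x9 matrix from the question, evaluated at numeric k1, k2."""
    e = lambda x: np.exp(1j * x)   # shorthand for exp(i x)
    c = 2 * np.sqrt(3) * 1j        # the 2*sqrt(3)*i coefficient
    return np.array([
        [-e(-k1), e(-k1), e(-k1), e(-k1), 0, c*e(k1), 0, 0, 0],
        [e(k1), -e(k1), e(k1), e(k1), 0, 0, c*e(-k1), 0, 0],
        [e(-k2), e(-k2), -e(-k2), e(-k2), 0, 0, 0, c*e(-k2), 0],
        [e(k2), e(k2), e(k2), -e(k2), 0, 0, 0, 0, c*e(k2)],
        [0, 0, 0, 0, 4, 0, 0, 0, 0],
        [c*e(k1+k2), 0, 0, 0, 0, -e(-k1-k2), e(-k1-k2), e(-k1-k2), e(-k1-k2)],
        [0, c*e(k1-k2), 0, 0, 0, e(k1-k2), -e(k1-k2), e(k1-k2), e(k1-k2)],
        # last entry of the next row is e^{i(k1+k2)} exactly as written in the question
        [0, 0, c*e(k2-k1), 0, 0, e(k2-k1), e(k2-k1), -e(k2-k1), e(k1+k2)],
        [0, 0, 0, c*e(k1+k2), 0, e(k1+k2), e(k1+k2), e(k1+k2), -e(k1+k2)],
    ])

M = coin_matrix(0.7, 0.3)
vals, vecs = np.linalg.eig(M)                          # numerical eigendecomposition
assert np.allclose(vecs @ np.diag(vals) @ np.linalg.inv(vecs), M)  # M = V D V^{-1}
print(np.sort_complex(np.round(vals, 6)))
```

Since `np.linalg.eig` returns the eigenvectors as well, the numerical diagonalisation $M = VDV^{-1}$ is immediate, and $\lambda = 4$ (coming from the fifth row and column) appears among the eigenvalues.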

In summary, my advice is as follows.

  1. See if your needs can be satisfied by a numerical approximation that's accurate to a bunch of decimal places. If so, use that.
  2. Otherwise, think hard about whatever problem is leading to this question. Maybe you can find more structure or a better way of thinking about it so that the resulting linear algebra problem is easier.
  3. Otherwise, there might just not be any nice solution. If you just write random expressions in a $9 \times 9$ matrix then unfortunately this is by far the most common result! On the other hand, you might have reasons for feeling confident that some nice solution is possible (e.g. if this problem came from an exercise in some textbook). If that's the case, you should edit your question so answerers have the same context you have.
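As one cheap instance of point 2 (my own addition, again with arbitrary sample momenta): every row of the matrix has Euclidean norm $4$, so $M/4$ has unit rows, and it is worth testing numerically whether $M$ is normal, since a normal matrix ($MM^\dagger = M^\dagger M$) is unitarily diagonalisable:

```python
import numpy as np

def coin_matrix(k1, k2):
    """The 9x9 matrix from the question, evaluated at numeric k1, k2."""
    e = lambda x: np.exp(1j * x)
    c = 2 * np.sqrt(3) * 1j
    return np.array([
        [-e(-k1), e(-k1), e(-k1), e(-k1), 0, c*e(k1), 0, 0, 0],
        [e(k1), -e(k1), e(k1), e(k1), 0, 0, c*e(-k1), 0, 0],
        [e(-k2), e(-k2), -e(-k2), e(-k2), 0, 0, 0, c*e(-k2), 0],
        [e(k2), e(k2), e(k2), -e(k2), 0, 0, 0, 0, c*e(k2)],
        [0, 0, 0, 0, 4, 0, 0, 0, 0],
        [c*e(k1+k2), 0, 0, 0, 0, -e(-k1-k2), e(-k1-k2), e(-k1-k2), e(-k1-k2)],
        [0, c*e(k1-k2), 0, 0, 0, e(k1-k2), -e(k1-k2), e(k1-k2), e(k1-k2)],
        [0, 0, c*e(k2-k1), 0, 0, e(k2-k1), e(k2-k1), -e(k2-k1), e(k1+k2)],
        [0, 0, 0, c*e(k1+k2), 0, e(k1+k2), e(k1+k2), e(k1+k2), -e(k1+k2)],
    ])

M = coin_matrix(0.7, 0.3)   # sample momenta of my choosing

# each row has squared norm 4*1 + (2*sqrt(3))^2 = 16, i.e. norm 4
assert np.allclose(np.linalg.norm(M, axis=1), 4)

# normality probe: zero defect would mean M is unitarily diagonalisable
defect = np.linalg.norm(M @ M.conj().T - M.conj().T @ M)
print("normal" if defect < 1e-9 else "not normal at these momenta", defect)
```

If the defect is nonzero, you know not to hunt for an orthonormal eigenbasis, which already narrows the kind of structure worth looking for.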
Second answer:

Not an answer, but way too much for a comment. First, the matrix acts trivially on the fifth basis vector: the fifth row and column contain only the diagonal entry $4$, so $e_5$ is an eigenvector with eigenvalue $\lambda=4$, and we can delete that row and column. If you let $\omega_1=e^{-ik_1},\ \omega_2 =e^{-i k_2}$, the rest of the matrix becomes

$$ \begin{bmatrix} -\omega_1 & \omega_1 & \omega_1 & \omega_1 & 2\sqrt{3}i \omega_1^* & 0 & 0 & 0 \\ \omega_1^* & -\omega_1^* & \omega_1^* & \omega_1^* & 0 & 2\sqrt{3}i\omega_1 & 0 & 0 \\ \omega_2 & \omega_2 & -\omega_2 & \omega_2 & 0 & 0 & 2\sqrt{3}i \omega_2 & 0 \\ \omega_2^* & \omega_2^* & \omega_2^* & -\omega_2^* & 0 & 0 & 0 & 2\sqrt{3}i\omega_2^* \\ 2\sqrt{3}i \omega_1^* \omega_2^* & 0 & 0 & 0 & -\omega_1 \omega_2 & \omega_1 \omega_2 & \omega_1 \omega_2 & \omega_1 \omega_2\\ 0 & 2\sqrt{3}i\omega_1^* \omega_2 & 0 & 0 & \omega_1^* \omega_2 & -\omega_1^*\omega_2 & \omega_1^* \omega_2 & \omega_1^*\omega_2\\ 0 & 0 & 2\sqrt{3}i\omega_1 \omega_2^* & 0 & \omega_1 \omega_2^* & \omega_1\omega_2^* & -\omega_1 \omega_2^* & \omega_1^* \omega_2^*\\ 0 & 0 & 0 & 2\sqrt{3}i \omega_1^* \omega_2^* & \omega_1^* \omega_2^* & \omega_1^*\omega_2^* & \omega_1^*\omega_2^* & -\omega_1^*\omega_2^*\\ \end{bmatrix} $$

This is not all that much nicer, but I find it easier to read. It also makes the block structure more apparent, and someone else may notice a nice way to refine it further. For instance, the upper-right and lower-left diagonal blocks can be written as $2\sqrt{3}i(D_1\oplus D_2^*)$ and $2\sqrt{3}i[(\omega_1^* D_2)\oplus(\omega_2^* D_1^*)]$ respectively, where $D_1 =\operatorname{diag}(\omega_1^*,\omega_1)$, $D_2 =\operatorname{diag}(\omega_2^*,\omega_2)$ and $\oplus$ denotes the direct sum. (What I would love is to express the remaining blocks in these terms as well, but how to do so is not immediately apparent to me.)
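As a numerical sanity check of this reduction (my own sketch; the sample momenta are arbitrary): deleting the fifth row and column of the original matrix should reproduce the $\omega$-form, and the characteristic polynomial of the full matrix should factor as $(\lambda - 4)$ times that of the reduced one:

```python
import numpy as np

k1, k2 = 0.7, 0.3                  # arbitrary sample momenta
e = lambda x: np.exp(1j * x)
c = 2 * np.sqrt(3) * 1j

# the full 9x9 matrix from the question
M = np.array([
    [-e(-k1), e(-k1), e(-k1), e(-k1), 0, c*e(k1), 0, 0, 0],
    [e(k1), -e(k1), e(k1), e(k1), 0, 0, c*e(-k1), 0, 0],
    [e(-k2), e(-k2), -e(-k2), e(-k2), 0, 0, 0, c*e(-k2), 0],
    [e(k2), e(k2), e(k2), -e(k2), 0, 0, 0, 0, c*e(k2)],
    [0, 0, 0, 0, 4, 0, 0, 0, 0],
    [c*e(k1+k2), 0, 0, 0, 0, -e(-k1-k2), e(-k1-k2), e(-k1-k2), e(-k1-k2)],
    [0, c*e(k1-k2), 0, 0, 0, e(k1-k2), -e(k1-k2), e(k1-k2), e(k1-k2)],
    [0, 0, c*e(k2-k1), 0, 0, e(k2-k1), e(k2-k1), -e(k2-k1), e(k1+k2)],
    [0, 0, 0, c*e(k1+k2), 0, e(k1+k2), e(k1+k2), e(k1+k2), -e(k1+k2)],
])

# delete the fifth row and column (the e_5 eigenvector, eigenvalue 4)
R = np.delete(np.delete(M, 4, axis=0), 4, axis=1)

# the same 8x8 matrix rewritten in omega notation (co = complex conjugate)
w1, w2, co = e(-k1), e(-k2), np.conj
Rw = np.array([
    [-w1, w1, w1, w1, c*co(w1), 0, 0, 0],
    [co(w1), -co(w1), co(w1), co(w1), 0, c*w1, 0, 0],
    [w2, w2, -w2, w2, 0, 0, c*w2, 0],
    [co(w2), co(w2), co(w2), -co(w2), 0, 0, 0, c*co(w2)],
    [c*co(w1)*co(w2), 0, 0, 0, -w1*w2, w1*w2, w1*w2, w1*w2],
    [0, c*co(w1)*w2, 0, 0, co(w1)*w2, -co(w1)*w2, co(w1)*w2, co(w1)*w2],
    [0, 0, c*w1*co(w2), 0, w1*co(w2), w1*co(w2), -w1*co(w2), co(w1)*co(w2)],
    [0, 0, 0, c*co(w1)*co(w2), co(w1)*co(w2), co(w1)*co(w2), co(w1)*co(w2), -co(w1)*co(w2)],
])
assert np.allclose(R, Rw)          # omega form agrees with the deletion

# char poly of M equals (lambda - 4) times the char poly of R
assert np.allclose(np.poly(M), np.convolve([1, -4], np.poly(R)))
print("reduction verified")
```

Working with the $8 \times 8$ matrix removes the trivial eigenvalue up front, which is a small but genuine simplification for any further symbolic attack.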