How to write matrix form for an eigenvalue problem


I have an eigenvalue problem to solve; for the coupled linear equations

$\lambda A_n=-\alpha_nA_n+A_{n+1}+A_{n-1}-\beta_nB_n$

$\lambda B_n=+\alpha_nB_n-B_{n+1}-B_{n-1}+\beta^\ast_nA_n$

I want to transform it into the standard form $X\,\vec{p}=\lambda\,\vec{p}$, with $X$ a matrix and $\vec{p}$ a column vector built from the $A_n$ and $B_n$, e.g. $\begin{bmatrix} A_{n} \\ B_{n} \end{bmatrix}$. My question is how to construct a valid $X$. When I tried, the matrix $X$ had to include the couplings to the entries $A_{n\pm1}$ and $B_{n\pm1}$, which forced a modification of $\vec{p}$. For simplicity, I took $n=1,2$ only. Is there a valid workaround? Thanks.

Best answer

The discussion revealed that this is a system of $2m$ equations, with $n$ running over the values $1,\ldots,m$, and with the boundary conditions $A_0=B_0=A_{m+1}=B_{m+1}=0$. If we choose $(A_1,\ldots,A_m,B_1,\ldots,B_m)$ as the representation of the solution vector, then the matrix is $X=\begin{bmatrix}-A & -B \\ B^\star & A\end{bmatrix}$, where $$A = \begin{bmatrix}\alpha_1 & -1 & 0 & \cdots & 0 & 0 \\ -1 & \alpha_2 & -1 & \cdots & 0 & 0 \\ 0 & -1 & \alpha_3 & \cdots & 0 & 0 \\ \vdots & & & \ddots & & \vdots \\ 0 & 0 & 0 & \cdots & \alpha_{m-1} & -1 \\ 0 & 0 & 0 & \cdots & -1 & \alpha_m\end{bmatrix},\qquad B = \mathrm{diag}\{\beta_1,\ldots,\beta_m\}.$$ Note the sign of the upper-right block: the first equation carries the term $-\beta_n B_n$, so that block is $-B$. (Update: this can easily be adapted to the range $-N_p\leq n\leq N+N_p$ that appeared in comments to the OP.)
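To make the construction concrete, here is a small NumPy sketch (the values of $\alpha_n$ and $\beta_n$ below are hypothetical, just to have numbers): it assembles $A$, $B$, and the block matrix $X$ for $m=4$, then checks row by row that $X\vec{p}$ reproduces the original coupled equations, consistent with the $-\beta_n B_n$ sign in the first equation.

```python
import numpy as np

m = 4
alpha = np.array([1.0, 2.0, 3.0, 4.0])              # hypothetical alpha_n
beta = np.array([0.5 + 0.1j, 0.2, 0.3j, 0.4])       # hypothetical beta_n (may be complex)

# A: tridiagonal, alpha_n on the diagonal, -1 on the off-diagonals
A = np.diag(alpha) - np.diag(np.ones(m - 1), 1) - np.diag(np.ones(m - 1), -1)
B = np.diag(beta)

# X = [[-A, -B], [B*, A]] acting on p = (A_1..A_m, B_1..B_m)
X = np.block([[-A, -B], [B.conj(), A]])

# Verify each row of X p against the coupled recurrences for a random p,
# padding with zeros to enforce A_0 = B_0 = A_{m+1} = B_{m+1} = 0.
rng = np.random.default_rng(0)
p = rng.standard_normal(2 * m) + 1j * rng.standard_normal(2 * m)
a = np.concatenate(([0], p[:m], [0]))   # a[n] = A_n, with a[0] = a[m+1] = 0
b = np.concatenate(([0], p[m:], [0]))   # b[n] = B_n, with b[0] = b[m+1] = 0
Xp = X @ p
for n in range(1, m + 1):
    # lambda A_n = -alpha_n A_n + A_{n+1} + A_{n-1} - beta_n B_n
    assert np.isclose(Xp[n - 1],
                      -alpha[n - 1] * a[n] + a[n + 1] + a[n - 1] - beta[n - 1] * b[n])
    # lambda B_n = +alpha_n B_n - B_{n+1} - B_{n-1} + conj(beta_n) A_n
    assert np.isclose(Xp[m + n - 1],
                      alpha[n - 1] * b[n] - b[n + 1] - b[n - 1] + np.conj(beta[n - 1]) * a[n])

# The eigenvalues lambda and eigenvectors p then come from the 2m x 2m problem:
evals, evecs = np.linalg.eig(X)
```

The same pattern extends to any $m$ (or to the shifted index range from the comments) by changing the lengths of `alpha` and `beta`.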