Given is a symmetric tridiagonal $n\times n$ matrix $A$ over $\mathbb{R}$ of the following form: $A=\begin{pmatrix} 0 & a_{1} & & & \\ a_{1} & 0 &a_{2} & & \\ & a_{2} & 0 &\ddots & \\ & &\ddots &\ddots & a_{n-1} \\ & & & a_{n-1} & 0 \end{pmatrix}$
I am struggling to show that if $\lambda$ is an eigenvalue of $A$, then $-\lambda$ is an eigenvalue too.
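Before proving it, one can sanity-check the claim numerically. This is a quick sketch using NumPy (the entries $a_i$ here are arbitrary random values, just for illustration): the sorted spectrum of such a matrix should be symmetric about $0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
a = rng.standard_normal(n - 1)  # off-diagonal entries a_1, ..., a_{n-1}

# Symmetric tridiagonal matrix with zero diagonal
A = np.diag(a, 1) + np.diag(a, -1)
eigs = np.sort(np.linalg.eigvalsh(A))

# If the spectrum is symmetric about 0, the sorted eigenvalues satisfy
# lambda_k = -lambda_{n+1-k}
print(np.allclose(eigs, -eigs[::-1]))  # True
```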
Well, first I tried to see what the characteristic polynomial looks like for the cases $n=2,3$. Then I saw on Wikipedia that there is a recursive formula for the determinant: writing $\Delta_n$ for the determinant $\begin{vmatrix} -\lambda &a_{1} & & & \\ a_{1} &-\lambda &a_{2} & & \\ &a_{2} &-\lambda & \ddots & \\ & &\ddots &\ddots & a_{n-1} \\ & & &a_{n-1} &-\lambda \end{vmatrix}$, it satisfies $\Delta _{n}=(-\lambda)\Delta _{n-1} - a_{n-1}^{2}\Delta_{n-2}$.
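The recurrence can be checked numerically before relying on it in the induction. A minimal sketch with NumPy, assuming random test values for the $a_i$ and an arbitrary test value of $\lambda$, comparing the recurrence against a direct determinant computation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
a = rng.standard_normal(n - 1)  # off-diagonal entries a_1, ..., a_{n-1}
lam = 0.7                       # arbitrary test value of lambda

# Delta_k = det(A_k - lambda*I) via the three-term recurrence,
# with Delta_0 = 1 and Delta_1 = -lambda
deltas = [1.0, -lam]
for k in range(2, n + 1):
    deltas.append(-lam * deltas[-1] - a[k - 2] ** 2 * deltas[-2])

# Direct determinant for comparison
A = np.diag(a, 1) + np.diag(a, -1)
direct = np.linalg.det(A - lam * np.eye(n))

print(abs(deltas[n] - direct) < 1e-9)  # True: recurrence matches
```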
Unfortunately, I am stuck on the last step of the induction. Can anybody help me, please? Is this the correct approach? Thank you in advance!
Why don't you try to use induction on the determinant recurrence to show that the characteristic polynomial of the $(n\times n)$ matrix $A_n$ has a fixed parity: $$ \Delta_n(-\lambda) = (-1)^n\,\Delta_n(\lambda)$$ In other words, $\Delta_n$ contains only even powers of $\lambda$ when $n$ is even, and only odd powers when $n$ is odd. This will automatically show that whenever $\lambda$ is a root of $\Delta_n$, so is $-\lambda$, i.e. $-\lambda$ is also an eigenvalue.
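One way to finish the induction from the question is to show that $\Delta_n$ has a fixed parity. A sketch of the argument, using the determinant recurrence with base cases $\Delta_0 = 1$ and $\Delta_1 = -\lambda$ (which are even and odd, respectively):

```latex
% Claim: \Delta_n(-\lambda) = (-1)^n \Delta_n(\lambda).
% Inductive step: substitute -\lambda into the recurrence and apply
% the induction hypothesis to \Delta_{n-1} and \Delta_{n-2}.
\begin{align*}
\Delta_n(-\lambda)
  &= \lambda\,\Delta_{n-1}(-\lambda) - a_{n-1}^2\,\Delta_{n-2}(-\lambda) \\
  &= \lambda\,(-1)^{n-1}\Delta_{n-1}(\lambda)
     - a_{n-1}^2\,(-1)^{n-2}\Delta_{n-2}(\lambda) \\
  &= (-1)^n\bigl(-\lambda\,\Delta_{n-1}(\lambda)
     - a_{n-1}^2\,\Delta_{n-2}(\lambda)\bigr) \\
  &= (-1)^n\,\Delta_n(\lambda).
\end{align*}
```

So if $\Delta_n(\lambda)=0$, then $\Delta_n(-\lambda)=(-1)^n\Delta_n(\lambda)=0$ as well.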