What field do entries of eigenvectors belong in?


I have the following problem:

"Given the matrix $A = \begin{pmatrix} 1&i\\ i&1 \end{pmatrix}$, find the eigenspaces of the respective eigenvalues".

First I found the eigenvalues to be $\lambda_1 = 1+i$ and $\lambda_2 = 1-i$. Then, using that the eigenspace of an eigenvalue $\lambda$ is $E_{\lambda}=\text{Ker} (A -\lambda I)$ I found that if $(x,y) \in E_{\lambda_1}$ then $x=y$, and similarly if $(x,y) \in E_{\lambda_2}$ then $x=-y$.

My question arose when I wanted to write down where the $x$ and $y$ entries of the eigenvectors live. WolframAlpha defaults to real entries for the example eigenvectors (it gives $v_1 = (1,1)$ and $v_2 = (1,-1)$), but can the entries be complex too? And if so, is there any type of matrix whose eigenvectors can only have real entries? Thank you!


Best answer:

Yes, the eigenvectors can also have complex coefficients. For instance, since $(1,1)$ is an eigenvector, then so is $(i,i)$, since it is equal to $i(1,1)$.
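This is easy to check numerically; here is a minimal sketch with NumPy (not part of the original answer) verifying that both $(1,1)$ and $i(1,1)$ are eigenvectors of $A$ for $\lambda = 1+i$:

```python
import numpy as np

# The matrix from the question, with complex entries.
A = np.array([[1, 1j],
              [1j, 1]])

lam = 1 + 1j           # eigenvalue found above
v = np.array([1, 1])   # real eigenvector for lambda = 1 + i
w = 1j * v             # complex scalar multiple of v, also an eigenvector

print(np.allclose(A @ v, lam * v))  # True
print(np.allclose(A @ w, lam * w))  # True
```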

Another answer:

Generally, for an $n \times n$ matrix $A$ with entries in a field $k$ (for instance, the real numbers $\mathbb R$ or the complex numbers $\mathbb C$), the entries of the eigenvectors of $A$ lie in $k.$ (By definition, an eigenvector of $A$ is a nonzero $n \times 1$ vector $v$ with entries in $k$ such that $Av = \lambda v$ for some scalar $\lambda \in k.$)

Strictly speaking, it is best practice to work in a field $k$ that is algebraically closed, i.e., a field over which every nonconstant polynomial has a root. ($\mathbb C$ is perhaps the most ubiquitous example of an algebraically closed field.) Explicitly, if $k$ is algebraically closed, then any $n \times n$ matrix over $k$ has eigenvalues (and therefore eigenvectors). By definition, we determine the eigenvalues of $A$ by computing the roots of the degree $n$ polynomial $p(x) = \det(A - xI),$ hence if $k$ is algebraically closed, then $p(x) = (-1)^n (x - \lambda_1)^{e_1} \cdots (x - \lambda_r)^{e_r}$ for distinct $\lambda_i$ and integers $e_i \geq 1$ such that $e_1 + \cdots + e_r = n.$ (Of course, the $\lambda_i$ are the desired eigenvalues.)

We claim that the following matrix (with entries viewed as elements of $\mathbb R$) has no eigenvalues over $\mathbb R.$ $$A = \begin{pmatrix} \phantom{-} 0 & 1 \\ -1 & 0 \end{pmatrix}$$ Explicitly, we have that $p(x) = \det(A - xI) = x^2 + 1.$ The roots of $p(x)$ are $\pm i,$ hence $p(x)$ has no real roots. Since $A$ is a real matrix, it has no real eigenvalues.
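As a quick numerical illustration of this claim (a NumPy sketch, not part of the original answer), asking for the eigenvalues of this real matrix produces a purely imaginary conjugate pair:

```python
import numpy as np

# Rotation-by-90-degrees matrix: real entries, no real eigenvalues.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # the complex pair +i and -i (ordering may vary)
print(np.allclose(eigenvalues.real, 0.0))                  # True
print(np.allclose(sorted(eigenvalues.imag), [-1.0, 1.0]))  # True
```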

Lastly, to your final question: every symmetric $n \times n$ matrix over $k = \mathbb R$ has real eigenvalues. Even better, such a matrix is orthogonally diagonalizable. (If you are curious, by all means, Google that.)

Proof. Consider a symmetric $n \times n$ real matrix $A.$ Every real number is complex (with $0$ imaginary part), hence we can view $A$ as a complex matrix. Consequently, there exists an eigenvalue $\lambda$ of $A$ and a nonzero complex vector $v$ such that $Av = \lambda v$ (because $\mathbb C$ is algebraically closed). Observe that $$\bar v^t Av = \bar v^t(Av) = \bar v^t(\lambda v) = \lambda \bar v^t v,$$ where $\bar v$ denotes the vector whose entries are the complex conjugates of the entries of $v$ and $\bar v^t$ is the usual transpose of the vector $\bar v.$ By hypothesis that $A$ is symmetric and real, we have that $A^t = A = \bar A = \bar A^t,$ from which it follows that $$\bar v^t A v = \bar v^t \bar A^t v = \overline{v^t A^t} v = \overline{(Av)^t} v = \overline{(\lambda v)^t} v = \bar \lambda \bar v^t v.$$ Ultimately, the left-hand sides of both displayed equations agree, hence $\lambda \bar v^t v = \bar \lambda \bar v^t v.$ Since $v \neq 0,$ we have $\bar v^t v = \sum_i |v_i|^2 > 0,$ hence $\lambda = \bar \lambda$ is real. QED.
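The theorem is also easy to check numerically; a minimal NumPy sketch (an illustration, not part of the original answer), symmetrizing a random real matrix and confirming its eigenvalues have no imaginary part:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = M + M.T  # S is real and symmetric by construction

eigenvalues = np.linalg.eigvals(S)
print(np.allclose(eigenvalues.imag, 0.0))  # True: all eigenvalues are real
```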