Real matrices with non-real eigenvalues


[Image of the question; from the answers below, it asks: let $A$ be a real $n \times n$ matrix with a complex eigenvalue $\lambda = a + bi$, $b \ne 0$, and eigenvector $\vec z = \vec x + i\vec y$ with $\vec x, \vec y \in \Bbb R^n$. Show (a) $A\vec x = a\vec x - b\vec y$ and $A\vec y = a\vec y + b\vec x$; (b) $\vec x$ and $\vec y$ are linearly independent over $\Bbb R$; (c) $\text{span}\{\vec x, \vec y\}$ contains no real eigenvector of $A$.]

I know this covers a lot, so perhaps someone could redirect me to a helpful website.

for a) I have no idea where to start on the proof, as I don't understand why this is true.

for b) I also have no idea.

for c) A is a real matrix, so if one eigenvalue is complex, the others are too? (the conjugate of the first)?

3 Answers

Best answer:

Hint for a): What does it mean for $z$ to be an eigenvector? It means that $Az=\lambda z$. When would you say that two complex quantities are equal? You equate some parts. What, with what?

Hint for b) Assume they are linearly dependent and $y=kx$ for some $k$. Then use (a) and the fact that $b \neq 0$.


Hints have already been given for (a) and (b), so I'll address (c). There are definitely matrices with both real and non-real eigenvalues: as a trivial example, let $A$ be an $n \times n$ matrix with non-real eigenvalues and consider the block matrix:

$$B = \begin{pmatrix} k& 0 \\ 0 & A \end{pmatrix}$$

where $k$ is real. This adds an eigenvector of the form $(1,0,\dots,0)^T$ with real eigenvalue $k$. On the other hand, for every eigenvector $v$ of $A$, the vector $(0, v^T)^T$ is an eigenvector of $B$ with the same eigenvalue. However, you are right to note that if a real matrix has a non-real eigenvalue, then its conjugate is also an eigenvalue. This is because the characteristic polynomial of a real matrix has real coefficients, and real polynomials always have root sets symmetric under conjugation.
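
This construction can be sanity-checked numerically. In the sketch below (my own illustrative choice, not from the question), $A$ is the $2 \times 2$ rotation block $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, which has eigenvalues $\pm i$, and we border it with the real eigenvalue $k = 2$:

```python
# Plain-Python sanity check (no libraries): B = [[k, 0], [0, A]] keeps both
# the real eigenvalue k and the non-real eigenvalues of A. The specific
# 2x2 block A = [[0, -1], [1, 0]] is an illustrative choice.

def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector; entries may be complex."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

k = 2
B = [[k, 0,  0],
     [0, 0, -1],
     [0, 1,  0]]

# (1, 0, 0)^T is an eigenvector of B with the real eigenvalue k:
e1 = [1, 0, 0]
print(matvec(B, e1))            # [2, 0, 0] = k * e1

# v = (1, -i)^T satisfies A v = i v, so (0, v^T)^T is an eigenvector of B
# with the same non-real eigenvalue i:
w = [0, 1, -1j]
print(matvec(B, w))             # [0, 1j, 1] = i * w
```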


We are given that $A$ is a real $n \times n$ matrix with a complex eigenvalue $\lambda = a + bi$, $b \ne 0$, and a (necessarily) complex vector $\vec z \in \Bbb C^n$ such that

$A \vec z = \lambda \vec z. \tag{1}$

Writing

$\vec z = \vec x + i\vec y, \tag{2}$

with $\vec x, \vec y \in \Bbb R^n$, and inserting it into (1) yields

$A(\vec x + i\vec y) = \lambda(\vec x + i\vec y) = (a + ib)(\vec x + i\vec y). \tag{3}$

(3) may be written as

$A \vec x + iA \vec y = (a\vec x - b\vec y) + i(a \vec y + b \vec x), \tag{4}$

or, writing the real and imaginary parts out separately, we find

$A\vec x = a\vec x - b\vec y, \tag{5}$

$A\vec y = a\vec y + b\vec x. \tag{6}$
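
Identities (5) and (6) can be checked on a concrete example (the matrix and eigenpair below are my own illustrative choices): $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ has eigenvalue $\lambda = i$ (so $a = 0$, $b = 1$) with eigenvector $\vec z = (1, -i)^T$, i.e. $\vec x = (1, 0)^T$ and $\vec y = (0, -1)^T$:

```python
# Numerical check of (1), (5), (6): A z = lambda z splits into the real
# identities A x = a x - b y and A y = a y + b x. The rotation matrix A
# below is an illustrative choice with eigenvalue i.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[0, -1],
     [1,  0]]
a, b = 0, 1                     # lambda = a + b*i = i
x = [1, 0]                      # real part of the eigenvector z = x + i y
y = [0, -1]                     # imaginary part

# Equation (1): A z = lambda z, with z = x + i y
z = [xi + 1j * yi for xi, yi in zip(x, y)]
assert matvec(A, z) == [(a + b * 1j) * zi for zi in z]

# Equations (5) and (6):
assert matvec(A, x) == [a * xi - b * yi for xi, yi in zip(x, y)]  # A x = a x - b y
assert matvec(A, y) == [a * yi + b * xi for xi, yi in zip(x, y)]  # A y = a y + b x
print("equations (1), (5), (6) verified")
```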

It should be remembered that though $A$ is a real matrix, it must be construed as acting on the complex vector space $\Bbb C^n$; otherwise the assertion that $A$ has a complex eigenvalue makes no sense. Thus we consider $A$ as acting on $\Bbb C^n$; as such, it is a $\Bbb C$-linear map, that is, $A(c \vec z) = cA \vec z$ for any $\vec z \in \Bbb C^n$ and $c \in \Bbb C$; this justifies writing $A(\vec x + i \vec y) = A\vec x + iA\vec y$, which was tacitly invoked in the transition between (3) and (4). In any event, the preceding establishes item (1).

We observe that $b \ne 0$ implies $\vec x \ne 0 \ne \vec y$; for if $\vec x = 0$, then by (5), $b\vec y = 0$ and hence, since $b \ne 0$, $\vec y = 0$. Likewise if $\vec y = 0$, (6) forces $\vec x = 0$; this shows either both $\vec x$, $\vec y$ are zero or neither are; but both cannot be zero since the eigenvector $\vec z \ne 0$; thus $\vec x \ne 0 \ne \vec y$.

As for item (2), note that if $\vec x$ and $\vec y$ were linearly dependent over $\Bbb R$ (that is, as elements of $\Bbb R^n$), then there would exist $\alpha, \beta \in \Bbb R$, not both zero, with

$\alpha \vec x + \beta \vec y = 0, \tag{7}$

Since $\vec x \ne 0 \ne \vec y$, both coefficients are in fact nonzero ($\beta = 0$ would force $\vec x = 0$, and $\alpha = 0$ would force $\vec y = 0$); thus we may write

$\vec y = -\dfrac{\alpha}{\beta} \vec x = k \vec x \tag{8}$

with $0 \ne k = -(\alpha/\beta) \in \Bbb R$. But then, from (3),

$(1 + ik)A \vec x = A(\vec x + ik \vec x) = A(\vec x + i \vec y) = (a + bi)(\vec x + i\vec y) = (a + bi)(\vec x + ik\vec x) = (1 + ik)(a + bi)\vec x, \tag{9a}$

or, cancelling the nonzero factor $1 + ik$,

$A\vec x = (a + bi)\vec x. \tag{9b}$

Now the left hand side of (9b) is real, since both $A$ and $\vec x$ are; but the right is only real in the event $b = 0$; thus the hypothesis $b \ne 0$ prohibits (7) and hence $\vec x$, $\vec y$ are linearly independent, establishing item (2).
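
For the same illustrative eigenpair used above ($A$ the rotation matrix, $\vec x = (1,0)^T$, $\vec y = (0,-1)^T$), the independence of $\vec x$ and $\vec y$ can be seen directly: the $2 \times 2$ determinant of the matrix with columns $\vec x$ and $\vec y$ is nonzero.

```python
# For the illustrative eigenpair of A = [[0, -1], [1, 0]] (lambda = i,
# z = x + i y with x = (1, 0), y = (0, -1)), confirm that x and y are
# linearly independent in R^2: det(x | y) != 0.

x = [1, 0]
y = [0, -1]

det = x[0] * y[1] - x[1] * y[0]
print(det)                      # -1; nonzero, so x and y are independent
```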

As for (3), suppose that $\text{span}\{\vec x, \vec y \}$ contained a real eigenvector $\vec w$; then we would have

$\vec w = \alpha \vec x + \beta \vec y \tag{10}$

with $\alpha, \beta \in \Bbb R$ not both zero. Furthermore $\vec w$ being a real eigenvector satisfies

$A\vec w = \mu \vec w \tag{11}$

with $\mu \in \Bbb R$ since $A$ and $\vec w$ are real. Using (10) in (11) we have

$A(\alpha \vec x + \beta \vec y) = \mu (\alpha \vec x + \beta \vec y), \tag{12}$

or by virtue of (5)-(6),

$\alpha A \vec x + \beta A \vec y = \alpha(a \vec x - b \vec y) + \beta(a \vec y + b \vec x) = \mu \alpha \vec x + \mu \beta \vec y, \tag{13}$

and since $\vec x$, $\vec y$ are both non-vanishing and linearly independent we can collect and equate their coefficients in (13):

$a \alpha + b \beta = \mu \alpha, \tag{14}$

$-b \alpha + a \beta = \mu \beta; \tag{15}$

we see that these two equations are in fact the eigen-equation for a $2 \times 2$ matrix:

$\begin{bmatrix} a & b \\ -b & a \end{bmatrix} \begin{pmatrix} \alpha \\ \beta \end{pmatrix} = \mu \begin{pmatrix} \alpha \\ \beta \end{pmatrix}, \tag{16}$

and recalling that $(\alpha, \beta)^T \ne 0$ we see that $\mu$ must be an eigenvalue of

$C = \begin{bmatrix} a & b \\ -b & a \end{bmatrix}; \tag{17}$

but it is easily seen that the eigenvalues of $C$ are $a \pm bi$, since the characteristic polynomial of $C$ is $x^2 -2a x + (a^2 + b^2)$; since $b \ne 0$, $\mu$ cannot be real; we have reached a contradiction from which it follows that $\text{span}\{\vec x, \vec y \}$ contains no real eigenvector $\vec w$; thus is item (3) established. QED.
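
The claim that $C$ has eigenvalues $a \pm bi$ can be verified directly by solving the characteristic polynomial $x^2 - 2ax + (a^2 + b^2)$ with the quadratic formula (the sample values $a = 3$, $b = 4$ below are arbitrary):

```python
import cmath

# Roots of the characteristic polynomial x^2 - 2 a x + (a^2 + b^2) of
# C = [[a, b], [-b, a]]; the sample values a = 3, b = 4 are arbitrary.
a, b = 3.0, 4.0

disc = (2 * a) ** 2 - 4 * (a * a + b * b)   # = -4 b^2 < 0 whenever b != 0
r1 = (2 * a + cmath.sqrt(disc)) / 2
r2 = (2 * a - cmath.sqrt(disc)) / 2

print(r1, r2)                   # a + bi and a - bi: (3+4j) (3-4j)
```

Since the discriminant is $-4b^2 < 0$ for $b \ne 0$, both roots are genuinely non-real, which is exactly why $\mu$ cannot be real.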

Hope this helps. Cheers,

and as always,

Fiat Lux!!!