How to find matrices $A$ and $B$ such that for all real $t$, the eigenvalues of matrix $A+tB$ are $\pm \sqrt{t}$?


I tried to proceed as following:

$Av = av$ ($a =$ eigenvalue of matrix $A$).

$Bv = bv$ ($b =$ eigenvalue of matrix $B$).

Then, by definition, $(A+tB)v = (a+tb)v$, which should equal $(\pm\sqrt{t})v$.

I don’t know how to proceed further. Any help?


I suppose you are considering $2 \times 2$ matrices.

There are infinitely many matrices with the property you are considering, as a brute-force computation shows.

Let $A = \begin{pmatrix}a & b \\ c & d \end{pmatrix}$ and $B = \begin{pmatrix}e & f \\ g & h \end{pmatrix}$, and let $I = \begin{pmatrix}1 & 0 \\ 0 & 1 \end{pmatrix}$ be the identity matrix. Suppose $t \gt 0$ (as you are taking square roots), so we can change variable via $t = u^2$.

Let $M_u = A + u^2 B$.

You want $M_u$ to have $u$ and $-u$ as eigenvalues, i.e., by definition:

$\det(M_u + u I) = 0$ and $\det(M_u - u I) = 0$ for all $u$.

Let's write $M_u + u I = \begin{pmatrix}a + u^2 e + u & b + u^2 f \\ c+u^2 g & d + u^2h + u \end{pmatrix}$ and $M_u - u I = \begin{pmatrix}a + u^2 e - u & b + u^2 f \\ c+u^2 g & d + u^2h - u \end{pmatrix}$.

Let's expand the determinants of these matrices:

$\det(M_u + u I) = (eh-fg)u^4+(e+h)u^3+(ah-bg-cf+de+1)u^2+(a+d)u+(ad-bc)$

$\det(M_u - u I) = (eh-fg)u^4+(-e-h)u^3+(ah-bg-cf+de+1)u^2+(-a-d)u+(ad-bc)$
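These expansions can be double-checked symbolically. Here is a short SymPy sketch (SymPy is not part of the argument, just a convenient way to verify the coefficients; the symbol names mirror the matrix entries above):

```python
import sympy as sp

# Symbols matching the entries of A, B and the variable u above
a, b, c, d, e, f, g, h, u = sp.symbols('a b c d e f g h u')
A = sp.Matrix([[a, b], [c, d]])
B = sp.Matrix([[e, f], [g, h]])
M = A + u**2 * B  # M_u

p_plus = sp.expand((M + u * sp.eye(2)).det())   # det(M_u + u I)
p_minus = sp.expand((M - u * sp.eye(2)).det())  # det(M_u - u I)

# Coefficients of p_plus in u, from u^4 down to the constant term;
# they should match eh-fg, e+h, ah-bg-cf+de+1, a+d, ad-bc.
coeffs = [p_plus.coeff(u, k) for k in range(4, -1, -1)]

# p_minus is p_plus with u replaced by -u, which flips the odd coefficients.
```
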

These polynomials must be identically zero, i.e. the coefficient of every power of $u$ must vanish. Setting the coefficients equal to zero (both polynomials give the same conditions, since they differ only in the signs of the odd coefficients), we get a nonlinear system:

$\begin{cases} eh-fg = 0 \\ e+h = 0 \\ ah-bg-cf+de+1 = 0 \\ a+d = 0 \\ ad-bc = 0 \end{cases}$

Taking $a, c, e$ as nonzero free parameters (with $1-4ae\ge0$), we get the following solutions:

$b = -\dfrac{a^2}{c}$

$d = -a$

$g = \pm \dfrac{c\sqrt{-4ae+1}}{2a^2} + \dfrac{c(2ae-1)}{2a^2}$

$f = -\dfrac{e^2}{g}$

$h = -e$
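As a sanity check, this parametrization can be substituted back into the system. A small SymPy sketch (taking the "$+$" branch of $g$ and assuming $1-4ae \ge 0$ so the square root is real):

```python
import sympy as sp

a, c, e = sp.symbols('a c e', nonzero=True)

# The parametrization found above (taking the "+" branch for g)
d = -a
h = -e
b = -a**2 / c
s = sp.sqrt(1 - 4*a*e)               # real only when 1 - 4ae >= 0
g = c * (s + (2*a*e - 1)) / (2*a**2)
f = -e**2 / g

# Residuals of the five equations of the nonlinear system;
# they should all simplify to 0.
system = [e*h - f*g, e + h, a*h - b*g - c*f + d*e + 1, a + d, a*d - b*c]
residuals = [sp.simplify(sp.radsimp(expr)) for expr in system]
```
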

i.e. the following matrices solve your problem (where $k = ae$, $k \ne 0$, $k \le \frac{1}{4}$):

$A = \begin{pmatrix}a & -\dfrac{a^2}{c} \\ c & -a \end{pmatrix}$

$B_+ = \begin{pmatrix}\dfrac{k}{a} & -\dfrac{2k^2}{c}\dfrac{1}{\sqrt{-4k+1} + (2k-1)} \\ \dfrac{c}{2a^2}(\sqrt{-4k+1}+(2k-1)) & -\dfrac{k}{a} \end{pmatrix}$

or $B_- = \begin{pmatrix}\dfrac{k}{a} & -\dfrac{2k^2}{c}\dfrac{1}{-\sqrt{-4k+1} + (2k-1)} \\ \dfrac{c}{2a^2}(-\sqrt{-4k+1}+(2k-1)) & -\dfrac{k}{a} \end{pmatrix}$

Note that $A$, $B_+$, $B_-$ are singular matrices.

For example $A = \begin{pmatrix}2 & -4 \\ 1 & -2 \end{pmatrix}$ and $B = \begin{pmatrix}-1 & 4 \\ -\frac{1}{4} & 1 \end{pmatrix}$ solve your problem.
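You can confirm this concrete example numerically. A minimal plain-Python check (the helper `eigenvalues_2x2` is just for illustration, computing the roots of the characteristic polynomial directly):

```python
import math

# The concrete example above: A = [[2, -4], [1, -2]], B = [[-1, 4], [-1/4, 1]]
A = [[2.0, -4.0], [1.0, -2.0]]
B = [[-1.0, 4.0], [-0.25, 1.0]]

def eigenvalues_2x2(M):
    # Roots of the characteristic polynomial x^2 - tr(M) x + det(M).
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = tr * tr - 4.0 * det  # here tr = 0 and det = -t, so disc = 4t >= 0
    return ((tr + math.sqrt(disc)) / 2.0, (tr - math.sqrt(disc)) / 2.0)

for t in (0.25, 1.0, 2.0, 9.0):
    M = [[A[i][j] + t * B[i][j] for j in range(2)] for i in range(2)]
    plus, minus = eigenvalues_2x2(M)
    assert abs(plus - math.sqrt(t)) < 1e-9 and abs(minus + math.sqrt(t)) < 1e-9
```
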