Problem on the existence of certain matrices


Let $n,k \geq 2$ be two integers. Prove that there must exist some $n \times n$ invertible, non-diagonal matrices: $X_1,X_2, \dots , X_k$ with real entries, such that $$X_1^{-1}+X_2^{-1}+\dots+X_k^{-1}=(X_1+X_2+\dots+X_k)^{-1}$$

Since this problem asks us to construct the matrices, I tried to make it less general: I looked for matrices with $X_i^2=I_n$, hence $X_i^{-1}=X_i$, so that all that remains is to ensure the equality $(X_1+X_2+\dots+X_k)^2=I_n.$

For the case when $k$ is odd, I chose $X_1$ to be the matrix with $1$s on the antidiagonal and $0$s elsewhere; then taking $X_i=(-1)^iX_1$ for $i \geq 2$ solves the problem, since the terms with $i \geq 2$ cancel in pairs and the sum collapses to $X_1 = X_1^{-1}$.
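This odd-$k$ construction is easy to sanity-check numerically. The sketch below (plain Python; the ad-hoc $2\times 2$ matrix helpers are not from the post) verifies the case $k=3$, $n=2$:

```python
# Verify the odd-k construction for k = 3, n = 2:
# X1 is the antidiagonal (exchange) matrix; X2 = (+1)^2 X1, X3 = (-1)^3 X1.

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][t] * B[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def mat_inv(A):
    """Inverse of an invertible 2x2 matrix."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det, A[0][0] / det]]

def close(A, B, tol=1e-12):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(2) for j in range(2))

X1 = [[0.0, 1.0], [1.0, 0.0]]                 # 1s on the antidiagonal
Xs = [X1, X1, [[0.0, -1.0], [-1.0, 0.0]]]     # X2 = +X1, X3 = -X1

S, inv_sum = Xs[0], mat_inv(Xs[0])
for X in Xs[1:]:
    S = mat_add(S, X)
    inv_sum = mat_add(inv_sum, mat_inv(X))

# The i >= 2 terms cancel in pairs, so the sum collapses to X1 = X1^{-1}.
assert close(S, X1)
assert close(inv_sum, mat_inv(S))
print("odd-k construction verified for k = 3")
```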

However, I have problems when $k$ is even. I couldn't find such matrices even when $k=2$.

There are 3 solutions below.

BEST ANSWER

For $k=2$ we could take

$$ X_1=\begin{pmatrix} 0 & 1\cr 1 & 0 \end{pmatrix},\; X_2=\begin{pmatrix} \alpha & -\beta -1\cr \beta & -\alpha \end{pmatrix}.\; $$ Then we have $$X_1^{-1}+X_2^{-1}=(X_1+X_2)^{-1}$$ and $X_1^2=X_2^2=I$ if and only if $$ \alpha^2=\beta^2+\beta+1. $$ A special case is $\beta=1$, with $\alpha^2=3$.
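This can be checked numerically. The sketch below (plain Python, with ad-hoc $2\times 2$ helpers) confirms the special case $\beta = 1$, $\alpha = \sqrt{3}$:

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][t] * B[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def mat_inv(A):
    """Inverse of an invertible 2x2 matrix."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det, A[0][0] / det]]

def close(A, B, tol=1e-12):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(2) for j in range(2))

alpha, beta = math.sqrt(3.0), 1.0      # special case: alpha^2 = beta^2 + beta + 1
X1 = [[0.0, 1.0], [1.0, 0.0]]
X2 = [[alpha, -beta - 1.0], [beta, -alpha]]

I = [[1.0, 0.0], [0.0, 1.0]]
assert close(mat_mul(X1, X1), I)       # X1 is an involution
assert close(mat_mul(X2, X2), I)       # X2 is an involution
S = mat_add(X1, X2)
assert close(mat_add(mat_inv(X1), mat_inv(X2)), mat_inv(S))
print("k = 2 example verified")
```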


Here's a method that uses complex numbers. Let us note the association \begin{align} a+ib \longleftrightarrow \begin{pmatrix} a & -b\\ b & a \end{pmatrix}. \end{align} Hence it suffices to consider solutions to \begin{align} \frac{1}{z_1+z_2} = \frac{1}{z_1} + \frac{1}{z_2} \implies -1 = \frac{z_1}{z_2} + \frac{z_2}{z_1} = z + \frac{1}{z}. \end{align} In particular, let us focus on solutions lying on the unit circle. Hence we get \begin{align} -1 = e^{i\theta} + e^{-i\theta} = 2\cos \theta \end{align} which means $\theta = \frac{2\pi}{3}$ is a solution, that is, \begin{align} \frac{z_1}{z_2} = e^{i\frac{2\pi}{3}} = -\frac{1}{2}+i\frac{\sqrt{3}}{2}. \end{align} Let us choose $z_1 = i$ and $z_2 = \frac{\sqrt{3}}{2}-\frac{i}{2}$, then the corresponding matrices \begin{align} X_1 = \begin{pmatrix} 0 & -1\\ 1 & 0 \end{pmatrix} \ \ \ \text{ and } \ \ X_2 = \begin{pmatrix} \frac{\sqrt{3}}{2} & \frac{1}{2}\\ -\frac{1}{2} & \frac{\sqrt{3}}{2} \end{pmatrix} \end{align} satisfy the relation \begin{align} (X_1+X_2)^{-1} = X_1^{-1} + X_2^{-1}. \end{align}
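The complex-number version of the identity is a one-liner to verify with Python's built-in complex type; the matrix identity then follows because $a+ib \mapsto \begin{pmatrix} a & -b\\ b & a \end{pmatrix}$ is a ring isomorphism onto these rotation-scaling matrices:

```python
import cmath
import math

# z1 and z2 as chosen in the answer above.
z1 = 1j
z2 = math.sqrt(3) / 2 - 0.5j

# Their ratio is the primitive cube root of unity e^{i 2pi/3} ...
assert abs(z1 / z2 - cmath.exp(2j * math.pi / 3)) < 1e-12
# ... which is exactly what makes 1/(z1+z2) = 1/z1 + 1/z2 hold.
assert abs(1 / (z1 + z2) - (1 / z1 + 1 / z2)) < 1e-12
print("complex-number construction verified")
```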


How about assuming $X_i = Q \Lambda_i Q^T$ for some orthogonal matrix $Q$ and diagonal $\Lambda_i$? If $\Lambda_i$ is not a scaled identity (and $Q$ is chosen suitably, e.g. a generic rotation), these matrices are invertible but non-diagonal. Moreover, you have $X_1 + \dots + X_k = Q(\Lambda_1 + \dots + \Lambda_k)Q^T$ and $X_i^{-1} = Q \Lambda_i^{-1} Q^T$. This reduces your problem to diagonal matrices, which means you only need to find nonzero scalars $\lambda_{i,j}$ such that $\sum_{i=1}^k \frac{1}{\lambda_{i,j}} = \frac{1}{\sum_{i=1}^k \lambda_{i,j}}$ for each $j = 1, \dots, n$. You could try to work from there. Just an idea.
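One caveat worth noting: for $k=2$ the scalar equation becomes $(a+b)^2 = ab$, i.e. $a^2+ab+b^2=0$, which has no nonzero real solutions, so over $\mathbb{R}$ this reduction only helps for $k \geq 3$. For $k=3$ the triple $(1,-1,t)$ works for any $t \neq 0$. A sketch along these lines (plain Python; the choice of $Q$ as a 45-degree rotation and the particular per-slot triples are illustrative assumptions, not from the answer):

```python
import math

def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][t] * B[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def mat_inv(A):
    """Inverse of an invertible 2x2 matrix."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det, A[0][0] / det]]

def close(A, B, tol=1e-12):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(2) for j in range(2))

c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
Q  = [[c, -s], [s, c]]                  # orthogonal: rotation by 45 degrees
Qt = [[c, s], [-s, c]]                  # its transpose

# Each diagonal slot uses a (1, -1, t)-type triple, which satisfies
# sum(1/lam) = 1/sum(lam) for any t != 0; none of the Lambda_i below
# is a scaled identity, so the X_i come out non-diagonal.
Lams = [[[1.0, 0.0], [0.0, -1.0]],
        [[-1.0, 0.0], [0.0, 1.0]],
        [[2.0, 0.0], [0.0, 3.0]]]

Xs = [mat_mul(Q, mat_mul(L, Qt)) for L in Lams]
S, inv_sum = Xs[0], mat_inv(Xs[0])
for X in Xs[1:]:
    S = mat_add(S, X)
    inv_sum = mat_add(inv_sum, mat_inv(X))

assert close(inv_sum, mat_inv(S))
assert all(abs(X[0][1]) > 1e-9 for X in Xs)   # each X_i is non-diagonal
print("diagonal-reduction construction verified for k = 3")
```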