Curious Case of Idempotent Matrices - Seeking a Generalisation


Here's what I initially started with:

Find a $2\times2$ nonzero matrix $A$ satisfying $A^2=A$ and $A\neq I$.

I understand that this is fairly easy, but please keep reading for something interesting coming up -

Let's bash it. Assume a matrix A = $\begin{bmatrix}a&b \\c&d\end{bmatrix}$. Putting $A^2 = A$ gives me the following system to solve:

  • $bc = a(1-a)$
  • $bc = d(1-d)$
  • $b(a+d)=b$
  • $c(a+d)=c$

Some conclusions:

  1. If $a+d = 1$, then $bc = ad$. That is, if we assume a certain value for $a$, we get $d$, and choosing a value for $b$ gets us $c$ (or the other way round). So knowing one of the tuples $(a,b), (a,c), (d,b),$ or $(d,c)$ determines the matrix $A$. On the other hand, if we choose $b$ and $c$ to begin with, we obtain $a$ and $d$ from the obvious quadratic equations that follow. Knowing the tuple $(b,c)$ therefore also determines the matrix.

  2. If $a+d\neq 1$, then $(b,c)$ must be $(0,0)$ for the last two equations to hold. We are then left with $a^2=a$ and $d^2=d$, which forces $(a,d)=(1,1)$ (we reject $(0,1)$ and $(1,0)$ since those give $a+d=1$, and $(0,0)$ since that gives the null matrix). So if $a+d\neq1$, then in fact $a+d=2$ with $a=d=1$ and $(b,c)=(0,0)$. We have no freedom here (we can't choose variables the way we did in the previous case), as $a+d\neq1$ alone determines the entire matrix. We shall ignore this case anyway, since we demand $A\neq I$.

In conclusion, knowing one of the rows or columns determines the entire matrix (assuming the matrix is neither null nor the identity). Likewise, knowing the antidiagonal, i.e. the diagonal other than the main diagonal, determines the entire matrix.
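As a sanity check, the case-1 recipe can be verified numerically (a small numpy sketch, with illustrative values: pick $a$ and $b$ freely, then set $d=1-a$ and $c=ad/b$):

```python
import numpy as np

# Case a + d = 1: choose a and b freely (b != 0), then d = 1 - a, c = ad/b.
a, b = 0.3, 2.0
d = 1.0 - a
c = a * d / b

A = np.array([[a, b],
              [c, d]])

# A should satisfy A @ A == A without being the null or identity matrix.
print(np.allclose(A @ A, A))  # True
```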

The observation here is that in a $2\times2$ matrix, which has $4$ entries, knowing any pair of entries other than the pair along the main diagonal lets us determine the remaining entries.

Why is it so? Could we have said this without going through such cumbersome algebraic weightlifting?

Does this generalise to $n\times n$ idempotent matrices? That is, can we deduce something along the lines of:

  • Knowing any row or column determines the matrix OR
  • Knowing a certain minimum number of rows or columns ($>1$) determines the matrix OR
  • Knowing the diagonal other than the main diagonal determines the matrix OR

really anything along those lines. My gut came up with the above possibilities, if this apparently interesting pattern is to hold for matrices of higher order. I really feel there's something worth paying attention to, going on here.

I'd love it if y'all could share your thoughts on this, and help me identify a possible pattern. It'd be great to generalize this idea to higher-order idempotent matrices, perhaps even to other constraints, if there's nothing special about $A^2=A$ here. I think this is a really important question, since it boils down to: given a constraint in the form of a matrix equation, how many entries do I need to know to determine the rest of the matrix? (Uniquely determine, if that pleases you.)

Hoping to find something amazing, wish y'all a great day!


Accepted answer:

$A\in M_n(\mathbb{R})$ is a projector (possibly non-orthogonal). Projectors are classified by their trace. Assume that $\operatorname{rank}(A)=\operatorname{trace}(A)=r$ with $0<r<n$.

$A$ is associated with a unique decomposition $\mathbb{R}^n=E\oplus F$ where $\dim(E)=r$ and $\dim(F)=n-r$. The pair $(E,F)$, and hence $A$, depends on $r(n-r)+(n-r)r=2r(n-r)$ algebraically independent parameters.

Finally, if you choose $2r(n-r)$ entries of $A$ skilfully, then there are only finitely many possible values for the projector $A$.

In particular, if $r=1$ or $r=n-1$ (projection onto a line or onto a hyperplane), then it suffices to fix $2n-2$ entries of $A$ (but not just any $2n-2$ entries).
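To illustrate the $r=1$ count (a hypothetical numerical sketch, not part of the answer): a rank-1 projector onto the line spanned by $u$, along the hyperplane $v^\perp$, is $P = uv^T/(v^Tu)$, and after fixing the scaling of $u$ and $v$ this leaves $2n-2$ free parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Rank-1 projector: image = span(u), kernel = orthogonal complement of v.
# Generic random u, v satisfy v @ u != 0, which the formula requires.
u = rng.standard_normal(n)
v = rng.standard_normal(n)
P = np.outer(u, v) / (v @ u)

print(np.allclose(P @ P, P))         # idempotent: True
print(np.isclose(np.trace(P), 1.0))  # trace = rank = 1: True
```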

Another answer:

The observation here is that in a $2\times2$ matrix, which has $4$ entries, knowing any pair of entries other than the pair along the main diagonal lets us determine the remaining entries.

This is not true. E.g. we have $$ A=\pmatrix{1&x\\ 0&0}=\pmatrix{1&x\\ 0&0}^2 $$ for every $x$. Here you are given the first column and the second row of $A$, but you cannot determine $x$ without further information.
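The counterexample is easy to confirm numerically (a quick sketch, with an arbitrary value for $x$):

```python
import numpy as np

x = 3.7  # any value of x works: the matrix is idempotent regardless
A = np.array([[1.0, x],
              [0.0, 0.0]])

print(np.allclose(A @ A, A))  # True for every x
```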

Another answer:

Let $\{v_1,v_2,\dots,v_n\}$ be any basis of $\mathbb{R}^n$ (or of the field of your choice). Fix $k$ with $1\le k<n$ (to avoid trivial cases) and define the linear map $f\colon\mathbb{R}^n\to\mathbb{R}^n$ by decreeing that $$ f(v_i)=\begin{cases} v_i & 1\le i\le k \\[1ex] 0 & k<i\le n \end{cases} \tag{1} $$ Then clearly $f(f(v_i))=f(v_i)$ for $1\le i\le n$, so the map $f$ is idempotent, that is, $f^2=f$. Its matrix $A$ with respect to the standard basis will be idempotent as well.

Conversely, let $A$ be an idempotent matrix (not the zero matrix or the identity matrix); note that $A(I-A)=0$, so the matrix is not invertible. Let $X=\{v\in\mathbb{R}^n:Av=v\}$ and $Y=\{v\in\mathbb{R}^n:Av=0\}$.

Clearly, $X\cap Y=\{0\}$. Moreover if $v\in\mathbb{R}^n$, we have $$ v=Av+(v-Av) $$ Note that $A(Av)=A^2v=Av$, so $x=Av\in X$; also $A(v-Av)=Av-A^2v=Av-Av=0$, so $y=v-Av\in Y$. Therefore, from $v=x+y$, $x\in X$ and $y\in Y$, we conclude that $\mathbb{R}^n=X+Y$.

By independence of the two subspaces we get $n=\dim X+\dim Y$. If you fix a basis $\{v_1,\dots,v_k\}$ of $X$ and a basis $\{v_{k+1},\dots,v_n\}$ of $Y$, then $\{v_1,\dots,v_n\}$ is a basis satisfying the same conditions as in $(1)$.
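The construction in $(1)$ can be sketched numerically (an illustrative snippet, assuming the random basis matrix is invertible, which holds generically): put the basis vectors in the columns of $B$, let $D$ be diagonal with $k$ ones and $n-k$ zeros, and set $A = BDB^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2

# Columns of B form a (generically invertible) basis v_1, ..., v_n.
B = rng.standard_normal((n, n))

# f fixes v_1..v_k and kills v_{k+1}..v_n, so its matrix is B D B^{-1};
# then A^2 = B D^2 B^{-1} = B D B^{-1} = A since D is 0/1 diagonal.
D = np.diag([1.0] * k + [0.0] * (n - k))
A = B @ D @ np.linalg.inv(B)

print(np.allclose(A @ A, A))       # True
print(np.isclose(np.trace(A), k))  # trace = rank = k: True
```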


In the case $n=2$, the matrix $A$ has rank $1$, so a nonzero column (or row) “determines” the other column, in the sense that the other one must be a scalar multiple.

For instance, if the first column is nonzero, we need $$ A=\begin{bmatrix} a & ra \\ c & rc \end{bmatrix} $$ and the conditions $A^2=A$ reads, when $a\ne0$ and $c\ne0$, $$ r=\frac{1-a}{c} $$ If $a=0$, then $c\ne0$ and the form of the matrix is $$ \begin{bmatrix} 0 & 0 \\ c & 1 \end{bmatrix} $$ If $c=0$, then $a\ne0$, but this implies $a=1$, and the form of the matrix is $$ \begin{bmatrix} 1 & r \\ 0 & 0 \end{bmatrix} $$