$(I-A)^k=0 \text{ implies that } \exists A^{-1} \text{ s.t. }AA^{-1}=I $


I think this proposition is right; if it is not, could you provide a counterexample? It is definitely right in the $\mathbb{R}^3$ case. Here is how I proved it. Is the proof right, and if so, is it rigorous? Even if yes, are there other ways to prove this? I'm really curious. I give two methods, and I'm not sure whether either is right, or perhaps one is rigorous and the other is not. If the idea is right but not rigorously explained, could you please point out the flaws in the proofs? I'm new to proofs, and it's summer, so I can't ask my professors. Thanks.

$$(I-A)^k=0$$
$$\det\bigl((I-A)^k\bigr)=0 \Rightarrow \det(I-A)=0$$
Therefore there exists a non-zero vector $x$,
$$(2)\quad (I-A)x=Ix-Ax=0$$
There exists non-trivial $x$ s.t. $Ax=\lambda_{2}x$. So now
$$(I-A)x=x-\lambda_2 x=(1-\lambda_2)x$$
Multiplying both sides by $(A-I)$ $k$ times, we get $0=(1-\lambda)^k x$. Since $x$ is non-trivial, $1-\lambda=0 \Rightarrow \lambda=1$.

$\therefore$ $A$ has eigenvalue $1$ of multiplicity $k$, and so $A$ is invertible.

There are 4 answers below.

---

It is best to denote $B := I- A$, so $A = I-B$. Now $B$ is nilpotent ($B^k = 0$). Check that $$( I + B + B^2 + \cdots + B^{k-1})(I-B) = I.$$
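As a quick numerical sanity check of this truncated geometric series (a sketch using NumPy with one sample nilpotent $B$; the particular matrix is just an illustration):

```python
import numpy as np

# A sample 3x3 strictly upper-triangular matrix B, so B^3 = 0 (nilpotent).
B = np.array([[0., 1., 2.],
              [0., 0., 3.],
              [0., 0., 0.]])
k = 3
assert np.allclose(np.linalg.matrix_power(B, k), 0)  # B is nilpotent

# The claimed inverse of I - B is I + B + B^2 + ... + B^(k-1).
I = np.eye(3)
inv_candidate = sum(np.linalg.matrix_power(B, i) for i in range(k))
print(np.allclose(inv_candidate @ (I - B), I))  # True
```
The identity telescopes: every cross term cancels, leaving $I - B^k = I$.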

With your method: assume that $A v = 0$. Then $(I-A)v = v$, so by induction, $(I-A)^n v = v$ for all $n\ge 1$. Now, for $n=k$ we get $(I-A)^k v = v$. But the LHS is $0$, so $v=0$, and thus $A$ is injective, and therefore has an inverse $A^{-1}$.
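The induction step in this argument, written out explicitly:

```latex
% Base case: (I - A)v = v - Av = v, since Av = 0.
% Inductive step: assuming (I - A)^n v = v,
(I-A)^{n+1}v = (I-A)\bigl[(I-A)^{n}v\bigr] = (I-A)v = v .
% Taking n = k gives 0 = (I-A)^k v = v.
```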

---

About your attempt:

Your (1) is incorrect. You know that $(I-A)^k$ is the zero matrix, but you do not know that $(I-A)x=0$. There is absolutely no warrant for asserting that.

Your (2) shows that if $\lambda_2$ is an eigenvalue of $A$ and $x$ is an eigenvector corresponding to $\lambda_2$, then it is an eigenvector of $I-A$ corresponding to $1-\lambda_2$. It would be better to state it that way.

Your Method 1 is incorrect, since it assumes that $I-A=0$, which was not warranted.

Method 2 is also incorrect, because again you assume that $I-A=0$, which is not warranted.


Easy to prove facts:

  1. If $\lambda$ is an eigenvalue of $A$, then $c-\lambda$ is an eigenvalue of $cI-A$, for any scalar $c$.

  2. If $\lambda$ is an eigenvalue of $B$, then $\lambda^k$ is an eigenvalue of $B^k$.

  3. $C$ is invertible if and only if $\lambda=0$ is not an eigenvalue of $C$.

Using these facts: If $\lambda$ is an eigenvalue of $A$, then $1-\lambda$ is an eigenvalue of $I-A$ (what you prove in (2)), and $(1-\lambda)^k$ is an eigenvalue of $(I-A)^k$. Since $(I-A)^k=0$, the only eigenvalue of $(I-A)^k$ is $0$. Therefore...
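A numerical illustration of the eigenvalue argument (a sketch, not part of the answer: $A$ is constructed as $I$ minus a sample nilpotent matrix so that the hypothesis $(I-A)^k=0$ holds by design):

```python
import numpy as np

# Build A = I - N with N nilpotent, so (I - A)^k = N^k = 0 by construction.
N = np.array([[0., 5., 1.],
              [0., 0., 2.],
              [0., 0., 0.]])
A = np.eye(3) - N
k = 3
assert np.allclose(np.linalg.matrix_power(np.eye(3) - A, k), 0)

# Every eigenvalue of A is 1, so 0 is not an eigenvalue and A is invertible.
print(np.linalg.eigvals(A))  # all equal to 1
print(np.linalg.det(A))      # nonzero (here A is unit upper triangular)
```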

---

Since $[A,I]=0$, the binomial theorem applies, and we have

$$0=(I-A)^k=\sum_{i=0}^k\binom{k}{i}(-1)^iA^i I^{k-i}=I+\sum_{i=1}^k\binom{k}{i}(-1)^iA^i$$

Rearranging gives

$$I=-\sum_{i=1}^k\binom{k}{i}(-1)^iA^i=A\left[-\sum_{i=1}^k\binom{k}{i}(-1)^iA^{i-1}\right]$$

as desired.
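The bracketed sum is an explicit formula for $A^{-1}$. A numerical check of that formula (a sketch assuming $(I-A)^k=0$, with $A$ built from a sample nilpotent matrix so the hypothesis holds):

```python
import numpy as np
from math import comb

# Build A = I - N with N nilpotent, so (I - A)^k = 0 holds.
N = np.array([[0., 1., 4.],
              [0., 0., 2.],
              [0., 0., 0.]])
A = np.eye(3) - N
k = 3

# A^{-1} = -sum_{i=1}^{k} C(k,i) (-1)^i A^(i-1), read off the rearrangement.
A_inv = -sum(comb(k, i) * (-1) ** i * np.linalg.matrix_power(A, i - 1)
             for i in range(1, k + 1))
print(np.allclose(A @ A_inv, np.eye(3)))  # True
```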

---

Let's try a minimal polynomial argument. Consider $p(x)=(1-x)^k$. Then $A$ is a matrix which satisfies the polynomial $p(x)$. Now the minimal polynomial of $A$ has to be a divisor of $p(x)$ (this fact can be easily proven using the division algorithm). So the minimal polynomial of $A$ has to be of the form $m(x)=(x-1)^r$ where $r\leq k$; that is, the minimal polynomial of $A$ has the single root $1$, with multiplicity $r$. Now the roots of the minimal polynomial of $A$ are exactly the eigenvalues of $A$. So $1$ is the only eigenvalue of $A$. Hence $0$ is not an eigenvalue of $A$, and $A$ is invertible.
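The division-algorithm step mentioned in passing can be filled in as follows (a standard argument, spelled out for completeness):

```latex
% Why the minimal polynomial m(x) divides any p(x) with p(A) = 0:
% divide p by m with remainder,
p(x) = q(x)\,m(x) + r(x), \qquad \deg r < \deg m .
% Evaluating at A and using p(A) = 0 and m(A) = 0:
0 = p(A) = q(A)\,m(A) + r(A) = r(A).
% If r were nonzero, it would be a polynomial of smaller degree than m
% annihilating A, contradicting minimality of m. Hence r = 0 and m | p.
```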