Let $A$ be a square matrix such that $A^3 = I$. If $1$ is not an eigenvalue of $A$, show that $A^2 + A + I = 0$.

Comment from the asker: "So $1$ is neither an eigenvalue of $A^2$ nor of $A^3$. But I don't know what to make of that."

331 Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

There are 3 best solutions below.
Since an answer has already been given, here is a hint at another way to proceed.

Let $B = A^2 + A + I$ and, for an arbitrary vector $x$, consider the vector $y = Bx$. Use $A^3 = I$ to compute $Ay$, and recall that $1$ is not an eigenvalue of $A$.
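A quick numerical sanity check of this hint (my own illustration, not part of the original answer): take $A$ to be the rotation by $120°$, which satisfies $A^3 = I$ and has no real eigenvalue, in particular not $1$. Then $y = Bx$ satisfies $Ay = y$ and, consistently with the claim, $y = 0$ for every $x$.

```python
import numpy as np

# Rotation by 120 degrees: A^3 = I and 1 is not an eigenvalue of A.
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
A = np.array([[c, -s],
              [s,  c]])

B = A @ A + A + np.eye(2)        # B = A^2 + A + I

x = np.array([1.0, 2.0])         # an arbitrary vector
y = B @ x

# Key observation behind the hint: AB = A^3 + A^2 + A = I + A^2 + A = B,
# so Ay = y; y would be an eigenvector for eigenvalue 1 unless y = 0.
print(np.allclose(A @ y, y))     # True
print(np.allclose(y, 0))         # True
```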
$t^3 - 1$ is an annihilating polynomial for $A$. Assume that $A$ is a complex matrix; since $1$ is not an eigenvalue, the minimal polynomial of $A$ must be one of $t - \omega$, $t - \omega^2$, or $(t - \omega)(t - \omega^2)$, where $\omega$ is a primitive cube root of unity.
This gives $A = \omega I$, $A = \omega^2 I$, or $(A - \omega I)(A - \omega^2 I) = 0$ as the possibilities for $A$. Since $(t - \omega)(t - \omega^2) = t^2 + t + 1$, each case satisfies $A^2 + A + I = 0$.
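A numerical check of the three cases (my own illustration; the diagonal matrix below is a hypothetical representative of the third case, whose minimal polynomial is $(t-\omega)(t-\omega^2)$):

```python
import numpy as np

w = np.exp(2j * np.pi / 3)       # primitive cube root of unity, w^3 = 1

# Cases A = w*I and A = w^2*I: scalar matrices, and w^2 + w + 1 = 0.
for lam in (w, w**2):
    A = lam * np.eye(2)
    print(np.allclose(A @ A + A + np.eye(2), 0))   # True, twice

# Case with minimal polynomial (t - w)(t - w^2) = t^2 + t + 1:
A = np.diag([w, w**2])
print(np.allclose(A @ A + A + np.eye(2), 0))       # True
```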
We have $0 = A^3 - I = (A - I)(A^2 + A + I)$. Since $1$ is not an eigenvalue of $A$, the matrix $A - I$ is invertible; multiplying by $(A - I)^{-1}$ gives $A^2 + A + I = 0$.
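To see this factorization in action on a concrete matrix (my own illustrative choice, not from the answer): the companion matrix of $t^2 + t + 1$ is an integer matrix with $A^3 = I$ and no eigenvalue $1$.

```python
import numpy as np

# Companion matrix of t^2 + t + 1: satisfies A^3 = I, eigenvalues are the
# primitive cube roots of unity, so 1 is not an eigenvalue.
A = np.array([[0, -1],
              [1, -1]])
I = np.eye(2)

# The identity used in the answer: A^3 - I = (A - I)(A^2 + A + I).
print(np.allclose(np.linalg.matrix_power(A, 3) - I,
                  (A - I) @ (A @ A + A + I)))   # True
print(round(np.linalg.det(A - I)))              # 3, so A - I is invertible
print(np.allclose(A @ A + A + I, 0))            # True: cancel A - I
```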