Given $\ker T$, what can we conclude about $\ker (T-I)$?

Let $T:V\to V$ be a linear transformation from a vector space $V$ to itself.

Suppose we know everything about $\ker T$.

Q1) Does this tell us anything about $\ker(T-I)$, where $I$ is the identity transformation? What about just the dimension $\dim\ker(T-I)$: can we determine it, or at least give an upper or lower bound?

Q2) Suppose we add further conditions, e.g. that $T$ is self-adjoint and non-negative, meaning $\langle Tx,x\rangle\geq 0$ for all $x$. Does this give us more information about $\ker(T-I)$?

Thanks a lot. Any bit of help will be appreciated and upvoted by me.

There are 4 best solutions below

The kernel of $T-I$ is nontrivial if and only if $Tv=v$ for some nonzero vector $v$, and I cannot see how knowing $\ker(T)$ can give you any information about that, even if $T$ is non-negative and self-adjoint.

If $V$ is finite-dimensional with $\dim V=n$, then $\ker T$ and $\ker(T-I)$ are the eigenspaces of $T$ associated with the distinct eigenvalues $0$ and $1$. Since eigenvectors for distinct eigenvalues are linearly independent, the sum of their dimensions is at most $n$, i.e. $$\dim\ker(T-I)\leq n-\dim\ker T.$$

That's about the only thing you can say.
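This bound is easy to check numerically. Below is a minimal sketch with NumPy, using a hypothetical $3\times 3$ diagonal matrix with eigenvalues $0,0,1$ as $T$; kernel dimensions are computed via rank-nullity.

```python
import numpy as np

# Hypothetical example: T on R^3 with eigenvalues 0, 0, 1.
T = np.diag([0.0, 0.0, 1.0])
n = T.shape[0]

# Rank-nullity: dim ker A = n - rank A.
dim_ker_T = n - np.linalg.matrix_rank(T)                      # eigenspace of 0
dim_ker_T_minus_I = n - np.linalg.matrix_rank(T - np.eye(n))  # eigenspace of 1

print(dim_ker_T, dim_ker_T_minus_I)  # 2 1
assert dim_ker_T_minus_I <= n - dim_ker_T
```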

The answer is negative, even for the second question. Take $f$ and $g$ from $\mathbb{R}^2$ to itself, where $f$ is the identity and $g(x,y)=(x,2y)$. Both are self-adjoint and non-negative, and in both cases the kernel has dimension $0$. But if you subtract the identity from each of them, you get kernels of different dimensions.
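The counterexample can be verified directly; a small sketch with NumPy, representing $f$ and $g$ by their matrices:

```python
import numpy as np

def dim_ker(A):
    """Dimension of the kernel, via rank-nullity."""
    return A.shape[0] - np.linalg.matrix_rank(A)

I2 = np.eye(2)
f = np.eye(2)            # f = identity on R^2
g = np.diag([1.0, 2.0])  # g(x, y) = (x, 2y)

# Same kernel data: both kernels are trivial.
print(dim_ker(f), dim_ker(g))            # 0 0
# But after subtracting the identity the kernel dimensions differ.
print(dim_ker(f - I2), dim_ker(g - I2))  # 2 1
```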

Consider linear transformations expressed as matrices. The kernel is linked to the eigenvalue $0$ (vectors with $Av=0$), and if $T$ has eigenvalues $\lambda_i$, then $T-I$ has eigenvalues $\lambda_i-1$. So everything depends on how many eigenvalues of $T$ equal $0$ (they are mapped to $-1$ for $T-I$) and how many equal $1$ (they are mapped to $0$ for $T-I$). The result can vary vastly when we subtract $I$ from $T$: for example, $2I-I=I$ is invertible, while $I-I=0$ is the zero map.
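The eigenvalue shift described above can be seen numerically; a sketch with NumPy, using a hypothetical diagonal matrix with eigenvalues $0,1,3$:

```python
import numpy as np

# Hypothetical example: T with eigenvalues 0, 1, 3.
T = np.diag([0.0, 1.0, 3.0])

eig_T = np.linalg.eigvals(T)
eig_T_minus_I = np.linalg.eigvals(T - np.eye(3))

# Each eigenvalue of T - I is an eigenvalue of T shifted down by 1.
print(sorted(eig_T.real.tolist()))          # [0.0, 1.0, 3.0]
print(sorted(eig_T_minus_I.real.tolist()))  # [-1.0, 0.0, 2.0]
```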