If $T\in L(V)$ is diagonalizable then show that $R(T)\cap N(T)=\{0\}$
My Attempt:
Case 1: $V$ is finite dimensional
Subcase 1: $T$ is invertible
That means $N(T)=\{0\}$
Directly $R(T)\cap N(T)=\{0\}$
Subcase 2: $T$ is not invertible
This implies there is some $u\ne 0$ with $u\in N(T)$.
As $T$ is diagonalizable, $T$ has eigenvalues $\{0,a_1,\ldots,a_m\}$ with each $a_i\neq 0$.
Suppose, for contradiction, that there is
$u\neq 0$ with $u\in R(T)\cap N(T)$. Then
$T(u)=0$, and since $u\in R(T)$ there is some $w\in V$, $w\neq 0$, with $T(w)=u$.
As $u\in E(0,T)$, the eigenspace of $T$ corresponding to $0$,
by the direct-sum property of eigenspaces we get
$u\notin E(a_i,T)$ for each $i$, i.e. $(T-a_iI)u \neq 0$, that is, $Tu\neq a_iu$.
But from here I am not able to complete the argument.
Case 2: $V$ is infinite dimensional
Do we proceed the same way here, or is a different argument required?
Any help will be appreciated.
Since $T$ is diagonalizable, let $\{b_1,\ldots,b_n\}$ be a basis such that $Tb_i = \lambda_i b_i$ for $i =1, \ldots, n$.
WLOG assume that $\{b_1, \ldots, b_k\}$ is a basis for $N(T)$ and $\lambda_j \ne 0$ for $j = k+1, \ldots, n$.
Assume $y \in R(T) \cap N(T)$. Then $y = \sum_{i=1}^k \alpha_i b_i$ and $\exists x = \sum_{i=1}^n \beta_i b_i \in V$ such that $y = Tx$.
We have
$$ \sum_{i=1}^k \alpha_i b_i = y = Tx = \sum_{i=1}^n \beta_i Tb_i = \sum_{i=1}^n \beta_i \lambda_ib_i = \sum_{i=k+1}^n \beta_i \lambda_ib_i$$
so, since $\{b_1,\ldots,b_n\}$ is a basis, comparing coefficients gives $\alpha_{1} = \cdots = \alpha_{k} = 0$ (and $\beta_i\lambda_i = 0$ for $i > k$), hence $y = 0$.
Therefore $R(T) \cap N(T) = \{0\}$.
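As an illustrative numerical sanity check (not part of the proof), here is a short NumPy sketch: for a diagonalizable matrix $A = PDP^{-1}$ with some zero eigenvalues, the eigenvector columns of $P$ give bases of $N(A)$ and $R(A)$, and their union stays linearly independent, so the intersection is trivial. The particular matrix and eigenvalues are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Two zero eigenvalues, so dim N(A) = 2 and dim R(A) = 3.
D = np.diag([0.0, 0.0, 2.0, -1.0, 3.0])
P = rng.standard_normal((n, n))   # generically invertible
A = P @ D @ np.linalg.inv(P)

# Eigenvectors with eigenvalue 0 span N(A).
N_basis = P[:, :2]
# Eigenvectors with nonzero eigenvalue lie in R(A) (A(Pe_i) = lam_i Pe_i),
# and there are rank(A) = 3 of them, so they span R(A).
R_basis = P[:, 2:]

# If the combined columns have full rank n, the subspaces intersect only in {0}.
combined = np.hstack([R_basis, N_basis])
print("dim R(A) =", np.linalg.matrix_rank(R_basis))
print("dim N(A) =", np.linalg.matrix_rank(N_basis))
print("R(A) ∩ N(A) = {0}:", np.linalg.matrix_rank(combined) == n)
```

Since the columns of an invertible $P$ are linearly independent, the combined rank is $n$, which is exactly the statement $R(A)\cap N(A)=\{0\}$ in coordinates.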
Alternatively, if $V$ is a vector space over $\mathbb{R}$ or $\mathbb{C}$, we can define an inner product $\langle \cdot, \cdot\rangle$ on $V$ as $$\left\langle \sum_{i=1}^n \alpha_i b_i, \sum_{i=1}^n \beta_i b_i \right\rangle = \sum_{i=1}^n \alpha_i\overline{\beta_i}$$
Now notice that $\{b_1,\ldots, b_n\}$ is an orthonormal basis w.r.t. $\langle \cdot, \cdot\rangle$. Since $T$ diagonalizes in an orthonormal basis, $T$ is normal.
Then we have $R(T) = N(T)^\perp$ so in particular $R(T) \cap N(T) = \{0\}$.
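The step $R(T) = N(T)^\perp$ can be justified in two lines from standard facts about adjoints on a finite-dimensional inner product space:

```latex
N(T) = N(T^*) \quad \text{(for normal } T,\ \|Tv\| = \|T^*v\| \text{ for all } v\text{)}, \\
R(T) = N(T^*)^\perp = N(T)^\perp.
```

Here $R(T) = N(T^*)^\perp$ is the general rank–nullity duality for adjoints; normality is only needed for the first identity.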