Let $E$ be a linear operator on a finite-dimensional vector space $V$ such that $E^2=E.$ Then $E$ is a projection, and in particular every $v\in V$ can be written uniquely as $v=Ev+(v-Ev)$ with $Ev\in \text{Im}(E)$ and $v-Ev\in \ker(E).$
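To spell out why this decomposition works (a quick check, using only $E^2=E$):
$$E(Ev)=E^2v=Ev\implies Ev\in\text{Im}(E),\qquad E(v-Ev)=Ev-E^2v=0\implies v-Ev\in\ker(E).$$
Uniqueness follows because the two subspaces intersect trivially: if $w\in\text{Im}(E)\cap\ker(E)$, write $w=Eu$; then $w=Eu=E^2u=Ew=0.$ Hence $V=\text{Im}(E)\oplus\ker(E).$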
If $\mathcal{B}_1$ is a basis for $\text{Im}(E)$ and $\mathcal{B}_2$ is a basis for $\ker (E)$, then I know that $\mathcal{B}=\mathcal{B}_1\cup \mathcal{B}_2$ is a basis for $V.$ Now I want to show that the matrix $[E]_{\mathcal{B}}$ is diagonal, i.e. that $E$ is diagonalizable. Note that a similar question has already been asked on this site; however, I am more interested in knowing whether the following approach works, and if not, why.
So write $\mathcal{B}=\{\alpha_1,\alpha_2,\dots,\alpha_r,\alpha_{r+1},\dots,\alpha_{n}\},$ where the first $r$ elements are in $\mathcal{B}_1$ and the remaining in $\mathcal{B}_2.$ Then the entries of the matrix are determined, for $1\leq j\leq n$, by
$$E(\alpha_j)=\sum_{i=1}^{n}a_{ij}\alpha_{i}.$$
The last $n-r$ columns are $0$: if we pick $\alpha_{r+1}$, for instance, then $E(\alpha_{r+1})=0$, so
$$0=\sum_{i=1}^{n}a_{i,r+1}\alpha_{i},$$
and all the coefficients are $0$ because the $\alpha_i$ are linearly independent.
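As a sanity check, here is a minimal numerical sketch. The matrix and bases are my own hand-picked example (not from the question): a $3\times 3$ idempotent $E$ projecting onto the $xy$-plane along the line spanned by $(1,1,-1)$, with the change of basis showing that $[E]_{\mathcal{B}}$ comes out as $\text{diag}(1,1,0)$.

```python
import numpy as np

# Hypothetical idempotent: projection onto the xy-plane along span{(1,1,-1)}.
E = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])
assert np.allclose(E @ E, E)  # E^2 = E

# Basis adapted to V = Im(E) ⊕ ker(E):
B1 = np.array([[1., 0.],
               [0., 1.],
               [0., 0.]])    # columns span Im(E)
B2 = np.array([[1.],
               [1.],
               [-1.]])       # column spans ker(E)
P = np.hstack([B1, B2])      # change-of-basis matrix, columns = B

E_B = np.linalg.inv(P) @ E @ P   # [E]_B = P^{-1} E P
print(np.round(E_B, 6))          # ≈ diag(1, 1, 0)
```

The first $r=2$ columns of $[E]_{\mathcal{B}}$ are $e_1, e_2$ and the last column is zero, matching the claim above.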
So the goal is to show that for each $1\leq j\leq r$ we have $a_{ij}=0$ whenever $i\not =j$ (and in fact $a_{jj}=1$). This is where I am facing some trouble.
Maybe we can argue by contradiction: say $a_{ij}=k\not=0$ for some $i\not =j.$
Edit:
How about this: for $i\leq r$ each $\alpha_i$ lies in $\text{Im}(E)$, so $E(\alpha_i)=\alpha_i$ (write $\alpha_i=Eu$; then $E\alpha_i=E^2u=Eu=\alpha_i$). Also, for $j\leq r$ we have $E(\alpha_j)\in\text{Im}(E)=\text{span}(\mathcal{B}_1)$, so $a_{ij}=0$ for $i>r$ and
$$E(\alpha_j)=\sum_{i=1}^{r}a_{ij}\alpha_{i}\implies E\Big(\alpha_{j}-\sum_{i=1}^{r}a_{ij}\alpha_{i}\Big)=0.$$
Thus $\alpha_{j}-\sum_{i=1}^{r}a_{ij}\alpha_{i}$ lies in $\ker(E)$; but it also lies in $\text{span}(\mathcal{B}_1)=\text{Im}(E)$, and $\text{Im}(E)\cap\ker(E)=\{0\}$, so $\alpha_{j}-\sum_{i=1}^{r}a_{ij}\alpha_{i}=0$. By linear independence of $\mathcal{B}_1$, this implies $a_{ij}=\delta_{ij}.$
Since $E^2 = E$, the minimal polynomial $m_E(x)$ of $E$ divides $x^2-x = x(x-1)$. So $m_E(x)$ is either $x$, $x-1$, or $x(x-1)$. Since it is a product of distinct monic linear factors, $E$ is diagonalizable.
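The minimal-polynomial argument can also be checked numerically, using the same hand-picked idempotent as above (my example, not from the post). Since $m_E(x)\mid x(x-1)$, every eigenvalue of $E$ must be $0$ or $1$, and diagonalizability means the eigenvectors span $V$:

```python
import numpy as np

# Hypothetical idempotent (upper triangular, so eigenvalues sit on the diagonal).
E = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])
assert np.allclose(E @ E, E)  # E^2 = E

eigvals, eigvecs = np.linalg.eig(E)
print(np.sort(eigvals.real))  # all eigenvalues lie in {0, 1}

# Diagonalizable <=> the eigenvector matrix has full rank.
assert np.linalg.matrix_rank(eigvecs) == E.shape[0]
```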