I'm interested in eigendecomposition of a matrix.
It is clear to me that a matrix can be eigendecomposed if and only if it is diagonalizable.
I'm looking for a short proof that the columns of the similarity matrix in a diagonalization are the eigenvectors.
Here's the $2 \times 2$ case you can generalize:
Suppose $M$ is a $2 \times 2$ matrix with eigenvalues $\lambda_1, \lambda_2$ and two linearly independent corresponding eigenvectors, $v_1$ and $v_2$ (linear independence is exactly what makes $M$ diagonalizable). Define a matrix $V$ whose columns are those two eigenvectors,
\begin{equation} V =\left( \begin{matrix} | & | \\ v_1 & v_2 \\ | & | \end{matrix} \right) \end{equation} where the vertical bars indicate that each $v_i$ is a column.
Then multiplying $M$ by $V$ amounts to applying $M$ to each column of $V$ separately. Since those columns are eigenvectors, we may write
\begin{equation} MV =\left( \begin{matrix} | & | \\ Mv_1 & Mv_2 \\ | & | \end{matrix} \right) = \left( \begin{matrix} | & | \\ \lambda_1v_1 & \lambda_2v_2 \\ | & | \end{matrix} \right) = \underbrace{\left( \begin{matrix} | & | \\ v_1 & v_2 \\ | & | \end{matrix} \right)}_{V} \underbrace{\left( \begin{matrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{matrix} \right)}_{D} \end{equation}
where $D$ is the diagonal matrix of eigenvalues. That is, $MV = VD$. Because $v_1$ and $v_2$ are linearly independent, $V$ is invertible, so
\begin{equation} M = VDV^{-1} \end{equation}
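As a numerical sanity check, NumPy's `numpy.linalg.eig` returns exactly this pair: the eigenvalues and a matrix whose columns are the eigenvectors. A minimal sketch, using an arbitrary example matrix of my own choosing:

```python
import numpy as np

# An arbitrary diagonalizable matrix (eigenvalues 5 and 2, so distinct).
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eig returns the eigenvalues and a matrix V whose columns are eigenvectors.
eigvals, V = np.linalg.eig(M)
D = np.diag(eigvals)

# Verify both identities from the derivation.
print(np.allclose(M @ V, V @ D))                 # MV = VD
print(np.allclose(M, V @ D @ np.linalg.inv(V)))  # M = V D V^{-1}
```

Both checks print `True`; any matrix with a full set of linearly independent eigenvectors would work the same way.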