How does one prove that a matrix is similar to a diagonal matrix? Is there a rule or a sequence of steps to follow?
For example: suppose a matrix $A$ is given; prove that $A$ is similar to a diagonal matrix, or prove that it isn't. How does one approach such a problem?
Here's the idea behind diagonalization: An $n\times n$ matrix $A$ is similar to a diagonal matrix $D=\mathrm{diag}(\lambda_1,\cdots,\lambda_n)$ if and only if an $n\times n$ invertible matrix $P = \begin{bmatrix} P_1 & \cdots & P_n \end{bmatrix}$ exists such that $$ A = PDP^{-1}$$
Equivalently $$AP=PD$$
Writing it as column vectors, we have
$$\begin{bmatrix} AP_1 & AP_2 & \cdots &AP_n \end{bmatrix} = \begin{bmatrix} \lambda_1P_1 & \lambda_2 P_2 & \cdots & \lambda_n P_n\end{bmatrix}$$
Comparing columns, $AP_i = \lambda_i P_i$ for $i=1,2,\cdots,n$, which says precisely that each $P_i$ is an eigenvector of $A$ associated with the eigenvalue $\lambda_i$. Since $P$ must be invertible, its columns must be linearly independent, so $A$ is diagonalizable exactly when it has $n$ linearly independent eigenvectors; standard textbooks prove this condition is both necessary and sufficient. The procedure, then, is: find the eigenvalues, find a basis of each eigenspace, and check whether you obtain $n$ independent eigenvectors in total. If not, $A$ is not diagonalizable: for instance, $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ has the single eigenvalue $1$ with only a one-dimensional eigenspace, so it is not similar to any diagonal matrix.
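To make the recipe concrete, here is a minimal pure-Python sketch that carries it out on a small example of my own choosing ($A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix}$; the helper names and the matrix are illustrative, not part of the answer above). The eigen-data are worked out by hand in the comments, and the code only verifies that $PDP^{-1}$ reproduces $A$:

```python
# Verify a hand-computed diagonalization A = P D P^{-1} for a 2x2 example.

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    """Invert a 2x2 matrix via the adjugate formula."""
    a, b = X[0]
    c, d = X[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 1], [2, 3]]

# Characteristic polynomial: t^2 - 7t + 10 = (t - 5)(t - 2),
# so the eigenvalues are 5 and 2.
# Solving (A - 5I)v = 0 gives v = (1, 1);
# solving (A - 2I)v = 0 gives v = (1, -2).
P = [[1, 1], [1, -2]]   # columns are the eigenvectors
D = [[5, 0], [0, 2]]    # matching eigenvalues on the diagonal

reconstructed = matmul(matmul(P, D), inv2(P))
print(reconstructed)  # equals A up to floating-point rounding
```

The two eigenvectors are independent (they are not scalar multiples of each other), so $P$ is invertible and the diagonalization exists; for a matrix like $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ this step fails because only one independent eigenvector can be found.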