Diagonalization of a square matrix $A$ consists of finding an invertible matrix $P$ and a diagonal matrix $D$ such that $A = PDP^{-1}$.
What theorem tells us that $P$ is a matrix composed of the eigenvectors of $A$, $D$ is the diagonal matrix constructed from the corresponding eigenvalues, and $P^{-1}$ is the matrix inverse of $P$? I'm also interested in the proof.
You don't need a theorem for this. $$A = PDP^{-1} \implies AP = PD = P \operatorname{diag}(d_1, d_2, \dots, d_n).$$ Let the columns of $P$ be $u_1, u_2, \dots, u_n$. Then carrying out the multiplication $AP = PD$ column by column gives $$Au_1 = d_1 u_1, \quad Au_2 = d_2 u_2, \quad \dots, \quad Au_n = d_n u_n.$$
Now argue that a nonsingular matrix has no zero columns, so each $u_j$ is nonzero and is therefore an eigenvector of $A$ corresponding to the eigenvalue $d_j$, for $j = 1, 2, \dots, n$.
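The column-by-column argument above can be checked numerically. A minimal sketch with NumPy, using a hypothetical $2 \times 2$ example matrix (`np.linalg.eig` returns the eigenvalues $d_j$ and a matrix $P$ whose columns are the corresponding eigenvectors):

```python
import numpy as np

# Example matrix (eigenvalues 5 and 2); any diagonalizable A works.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

d, P = np.linalg.eig(A)   # d: eigenvalues, P: eigenvectors as columns
D = np.diag(d)

# AP = PD, the identity read off from A = PDP^{-1}
assert np.allclose(A @ P, P @ D)

# column by column: A u_j = d_j u_j, so each column is an eigenvector
for j in range(len(d)):
    assert np.allclose(A @ P[:, j], d[j] * P[:, j])

# and the factorization itself: A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```

This is just the converse direction of the same computation: starting from eigenvectors and eigenvalues, assembling $P$ and $D$ columnwise recovers $A = PDP^{-1}$.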