Theorem: If $\{e_1,\ldots,e_n\}$ is an orthonormal basis of $V$ such that each $e_j$ is an eigenvector, then the matrix of $A$ with respect to this basis is diagonal, and the diagonal elements are precisely the eigenvalues.
Please prove in simple terms as I am not very experienced with linear algebra.
The simplest proof:
For every $j$, $A e_j = \lambda_j e_j$, where $\lambda_j$ is the corresponding eigenvalue.
In any basis $B$ which includes $e_j$ as its $k$-th basis vector, the $k$-th column of the matrix of $A$ in basis $B$ is the coordinate vector of $A e_j$. Since $A e_j = \lambda_j e_j$, that column has $\lambda_j$ in the $k$-th position (on the diagonal) and zeroes everywhere else. And that's all.
So if $B$ is entirely made of eigenvectors, the matrix of $A$ is diagonal and has the eigenvalues on its diagonal.
Even if $B$ contains non-eigenvectors, the columns corresponding to the eigenvectors in $B$ are still entirely zeroes, except for the diagonal entries, which are the eigenvalues.
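A quick numerical sketch of this mixed-basis case, using NumPy (the matrix $A$ and the basis here are my own illustrative choices, not from the question): take $A$ with eigenvectors $e_1$ (eigenvalue $2$) and $e_2$ (eigenvalue $5$), and a basis whose first vector is the eigenvector $e_1$ but whose second vector $e_1 + e_2$ is not an eigenvector.

```python
import numpy as np

# A with eigenvectors e1 (eigenvalue 2) and e2 (eigenvalue 5).
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])

# Change-of-basis matrix: columns are the basis vectors.
# First column: the eigenvector e1. Second column: e1 + e2, not an eigenvector.
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Matrix of A in the mixed basis: C^{-1} A C.
A_mixed = np.linalg.inv(C) @ A @ C
print(A_mixed)
# The first column is [2, 0]: the eigenvalue on the diagonal, zero below,
# while the second column (for the non-eigenvector) has no such structure.
```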
This proof implicitly uses the fact that a basis is an independent set of vectors: a vector has a unique coordinate representation in this basis, so from $A e_j = \lambda_j e_j$ we directly read off the corresponding column of $A$.
To make this use of the independence property explicit, here is the proof stated in the comments:
Basis $B = \{e_1, \dots, e_n\}$. Let $A = (a_{ij})_{i,j=1\dots n}$ in this basis.
In basis $B$, the coordinates of $e_j$ are all zeroes, except a $1$ in the $j$-th position.
Then $A e_j$ in basis $B$ is the $j$-th column of $A$ in this basis: $A e_j = \sum_{i=1 \dots n} a_{ij} e_i$.
But also $A e_j = \lambda_j e_j$.
So $a_{1j} e_1 + a_{2j} e_2 + \dots + (a_{jj} - \lambda_j) e_j + \dots + a_{nj} e_n = 0$.
This is a linear combination of independent vectors equal to zero: it implies every coefficient is zero.
So we get $a_{ij} = 0$ when $i \ne j$, and $a_{jj} = \lambda_j$.
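The theorem can also be checked numerically. A minimal sketch with NumPy (the symmetric matrix below is my own example): `np.linalg.eigh` returns the eigenvalues together with an orthogonal matrix $P$ whose columns are an orthonormal eigenvector basis, and the matrix of $A$ in that basis, $P^{-1} A P = P^T A P$, comes out diagonal with the eigenvalues on the diagonal.

```python
import numpy as np

# A symmetric matrix, which is guaranteed to have an orthonormal eigenvector basis.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh returns the eigenvalues and an orthogonal matrix P of eigenvectors.
eigenvalues, P = np.linalg.eigh(A)

# Matrix of A in the eigenvector basis: P^{-1} A P (= P^T A P since P is orthogonal).
A_in_basis = P.T @ A @ P

# Off-diagonal entries vanish (up to rounding); the diagonal holds the eigenvalues.
print(np.round(A_in_basis, 10))
print(eigenvalues)
```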