I was wondering about a property of diagonalizing matrices. Say we have the form
$$A = CDC^{-1}.$$
Then the matrix $C$ must have linearly independent columns so that it is invertible and the equation makes sense. If I'm correct, does that mean the eigenvectors of $A$ are always linearly independent, since those make up the columns of $C$?
If that is true, we also know $D$ is diagonal with the eigenvalues of $A$ on its diagonal. So if we were given the columns of $C$ and $D$, is there only one possible $A$, or are there multiple choices of $A$? I'm asking because, as far as I'm aware, $C$ and $D$ can be built from an arbitrary choice of eigenvectors/eigenvalues, which leads me to believe there are different versions of $A$ depending on how $C$ and $D$ were chosen. However, I think I am missing something, because I feel like the arbitrary choice of $C$ and $D$ shouldn't affect the matrix $A$.
Diagonalizability means that you can find a basis for the space that consists entirely of eigenvectors. That is, you can find a linearly independent set of eigenvectors that spans the entire space. It does not mean that any old set of eigenvectors you pick is automatically linearly independent. After all, any nonzero scalar multiple of an eigenvector is also an eigenvector. For an extreme example, consider the identity matrix or the zero matrix: every nonzero vector is an eigenvector of those matrices.
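To make the identity-matrix example concrete, here is a quick numpy check (the vectors `v` and `w` are arbitrary picks for illustration): both are eigenvectors of $I$, yet they are scalar multiples of each other and so not linearly independent.

```python
import numpy as np

I = np.eye(2)

# Any nonzero vector is an eigenvector of the identity, with eigenvalue 1.
v = np.array([1.0, 2.0])
w = 3.0 * v                  # also an eigenvector, but a multiple of v

print(np.allclose(I @ v, 1.0 * v))                     # True: v is an eigenvector
print(np.allclose(I @ w, 1.0 * w))                     # True: w is an eigenvector
print(np.linalg.matrix_rank(np.column_stack([v, w])))  # 1: the set {v, w} is dependent
```

So a set of eigenvectors can easily fail to be a basis; diagonalizability only guarantees that *some* choice of eigenvectors forms one.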
On the other hand, the matrix $C$ that diagonalizes $A$ is not unique, at least in vector spaces over fields of characteristic zero. The columns of $C$ can be any ordered basis of eigenvectors. For instance, simply permuting the columns of $C$ or multiplying each column by some (perhaps different) nonzero scalar gives you a different matrix that also diagonalizes $A$. The two matrices do have to be “coordinated” in that the eigenvectors that are the columns of $C$ must match the corresponding eigenvalues on the diagonal of $D$.
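A quick numerical sketch of this coordination (the matrix `A` here is just a hypothetical example with distinct eigenvalues): if you permute the columns of $C$ and scale them by nonzero constants, you must permute the diagonal of $D$ the same way, and then every such pair reconstructs the same $A$.

```python
import numpy as np

# A hypothetical diagonalizable matrix with eigenvalues 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns eigenvalues w and a matrix C of eigenvector columns
w, C = np.linalg.eig(A)
D = np.diag(w)
A1 = C @ D @ np.linalg.inv(C)      # reconstruct A from this choice

# A second valid choice: permute the eigenvector columns and scale each
# by a nonzero scalar, then permute the eigenvalues of D to match.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])         # permutation matrix
S = np.diag([3.0, -2.0])           # nonzero column scalings
C2 = C @ P @ S                     # different eigenvector matrix
D2 = P.T @ D @ P                   # eigenvalues permuted to match C2
A2 = C2 @ D2 @ np.linalg.inv(C2)

print(np.allclose(A1, A))  # True
print(np.allclose(A2, A))  # True: a different (C, D) pair, same A
```

The scalings cancel because diagonal matrices commute, which is why rescaling columns of $C$ requires no change to $D$ at all; only the *order* of the eigenvalues has to track the order of the eigenvector columns.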