Prove that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct then
1) there are $n$ eigenvectors $\{\bar c_i\}$, one corresponding to each of those eigenvalues, and
2) that $\{\bar c_1,\bar c_2, \dots,\bar c_n\}$ are all linearly independent.
I will seek to prove part 1): that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then there are $n$ eigenvectors $\{\bar c_i\}$, one corresponding to each of those eigenvalues.

Recall that the algebraic multiplicity $a(\lambda_j)$ of an eigenvalue $\lambda_j$ is its multiplicity as a root of the characteristic polynomial $\det(A - \lambda I)$. So if an eigenvalue $\lambda_j$ repeats $k$ times, then $a(\lambda_j) = k$.
So in the case of distinct eigenvalues $\{\lambda_1,\ldots,\lambda_n\}$, we can conclude that each of the $n$ eigenvalues has algebraic multiplicity 1:
$$a(\lambda_i) = 1 \quad \forall\, \lambda_i$$
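As a numerical sanity check (a sketch using numpy, with a small hypothetical example matrix of my choosing), the algebraic multiplicities can be read off by counting repeated roots of the characteristic polynomial:

```python
import numpy as np

# A hypothetical 3x3 upper-triangular matrix whose diagonal entry 2
# repeats twice, so the eigenvalue 2 has algebraic multiplicity 2.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# The eigenvalues, counted with algebraic multiplicity, are the roots
# of the characteristic polynomial det(A - lambda*I).
eigvals = np.linalg.eigvals(A)

# Count how often each (rounded) eigenvalue appears: a(2) = 2, a(5) = 1.
values, counts = np.unique(np.round(eigvals).astype(int), return_counts=True)
multiplicity = dict(zip(values.tolist(), counts.tolist()))
print(multiplicity)  # {2: 2, 5: 1}
```

With distinct diagonal entries, every count would come out as 1, matching the equation above.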
Now what will the geometric multiplicities of these eigenvalues be?
The eigenspace $E_{\lambda_i}$ is best understood as the vector space spanned by all the eigenvectors corresponding to the eigenvalue $\lambda_i$, i.e. the collection of all vectors $\bar v$ that satisfy $A\bar v = \lambda_i\bar v$ forms the eigenspace.
An eigenspace has dimension greater than zero by definition: $\lambda$ is an eigenvalue of $A$ if $Ax = \lambda x$ for some $x \neq 0$, so $E_{\lambda_i}$ contains a nonzero vector and cannot be the zero space (the only space of dimension zero). We can conclude $0 < g(\lambda_i)$, i.e. $1 \le g(\lambda_i)$.
There is a standard proof for why $g(\lambda_i) \le a(\lambda_i)$: it shows that the characteristic polynomial has $(\lambda - \lambda_i)^{g(\lambda_i)}$ at least as a factor.
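For completeness, here is a sketch of that standard argument, assuming we may change basis freely (similar matrices share a characteristic polynomial):

```latex
% Let g = g(\lambda_i) and pick a basis \{v_1,\dots,v_g\} of E_{\lambda_i},
% extended to a basis \{v_1,\dots,v_n\} of the whole space. In that basis,
% A is similar to a block-triangular matrix:
A \sim \begin{pmatrix} \lambda_i I_g & B \\ 0 & C \end{pmatrix}
% Expanding the determinant along the first g columns gives
\det(A - \lambda I) = (\lambda_i - \lambda)^{g}\,\det(C - \lambda I)
% so (\lambda - \lambda_i)^{g(\lambda_i)} divides the characteristic
% polynomial, and hence g(\lambda_i) \le a(\lambda_i).
```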
$$1 \le g(\lambda_i) \le a(\lambda_i)$$
Thus in the case of distinct eigenvalues,
$$1 \le g(\lambda_i) \le 1 \quad \forall\, \lambda_i$$
$$g(\lambda_i) = 1 \quad \forall\, \lambda_i$$
Equivalently, $g(\lambda_i)$ is the number of linearly independent eigenvectors associated with $\lambda_i$: the eigenspace has dimension $g(\lambda_i)$, so any basis of it consists of exactly that many independent eigenvectors.
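This dimension can be computed by rank-nullity, since the eigenspace is the null space of $A - \lambda I$. A minimal numpy sketch (the matrix is a hypothetical example, chosen so that $g < a$, which can happen when eigenvalues repeat):

```python
import numpy as np

# g(lambda) is the dimension of the eigenspace, i.e. the nullity of
# A - lambda*I. For this matrix, lambda = 2 has algebraic multiplicity 2
# but only one independent eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
n = A.shape[0]

# Rank-nullity theorem: g(lam) = n - rank(A - lam*I)
g = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(g)  # 1
```

Note that this gap between $g$ and $a$ is exactly what the distinctness hypothesis rules out.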
Thus we have proved that associated with each of the $n$ distinct eigenvalues there is exactly one independent eigenvector (unique up to scalar multiples), which establishes 1).
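As a numerical check of the conclusion (a sketch, not a proof, on a hypothetical matrix with distinct eigenvalues): the $n$ eigenvectors returned by `numpy.linalg.eig` form a full-rank matrix, i.e. they are linearly independent, consistent with part 2) of the claim:

```python
import numpy as np

# A hypothetical 3x3 matrix with n = 3 distinct eigenvalues: 4, 2, -1.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, -1.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors

# The eigenvalues are indeed distinct.
assert len(set(np.round(eigvals).astype(int))) == 3

# The eigenvector matrix has full rank n, so the n eigenvectors
# (one per eigenvalue) are linearly independent.
rank = np.linalg.matrix_rank(eigvecs)
print(rank)  # 3
```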