Here is the formal statement:
Let $\lambda_1, \lambda_2, \lambda_3$ be distinct eigenvalues of an $n\times n$ matrix $A$. Let $S=\{v_1, v_2, v_3\}$, where $Av_i = \lambda_i v_i$ for $1\leq i\leq 3$. Prove that $S$ is linearly independent.
Many resources online give the general proof, or the proof for two eigenvectors. What is the proof specifically for three? I tried to derive the three-eigenvector proof from the two-eigenvector proof, but failed.
Here's one idea that comes to mind, although I don't promise there isn't a slicker way to do it. Suppose $c_1v_1 + c_2v_2 + c_3v_3 = 0.$ Applying $A$ gives $$\lambda_1c_1v_1 + \lambda_2c_2v_2 + \lambda_3c_3v_3 = 0.$$ On the other hand, multiplying the original equation by $\lambda_1$ gives $$\lambda_1c_1v_1 + \lambda_1c_2v_2 + \lambda_1c_3v_3 = 0.$$

Subtracting the second displayed equation from the first, we get $$(\lambda_2-\lambda_1)c_2v_2 + (\lambda_3-\lambda_1)c_3v_3 = 0.$$ Since you say you can prove that any two eigenvectors corresponding to distinct eigenvalues are linearly independent, applying that result to $v_2$ and $v_3$ gives $(\lambda_2-\lambda_1)c_2 = (\lambda_3-\lambda_1)c_3 = 0.$ But since the $\lambda_i$ are distinct, $\lambda_2-\lambda_1\neq 0$ and $\lambda_3-\lambda_1\neq 0$, so $c_2 = c_3 = 0.$ The original equation then reduces to $c_1v_1 = 0,$ and since eigenvectors are by definition nonzero, we conclude $c_1 = 0,$ completing the proof.
Notice the inductive nature of the proof: assuming the result for $k-1$ eigenvectors, the same subtraction trick eliminates one vector and proves it for $k$, so the argument extends to any $n$ eigenvectors with distinct eigenvalues.
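None of this replaces the proof, but if you want a quick numerical sanity check of the statement, here is a small sketch using NumPy. The matrix $A$ below is a hypothetical example I've constructed by conjugating $\mathrm{diag}(1,2,3)$ with an invertible matrix $P$, so its eigenvalues are distinct by construction, and we check that the three computed eigenvectors form a rank-3 (hence linearly independent) set.

```python
import numpy as np

# An invertible change-of-basis matrix (upper triangular with unit
# diagonal, so det P = 1). This is just an example choice.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# A is similar to diag(1, 2, 3), so its eigenvalues are exactly
# 1, 2, 3 -- three distinct values.
A = P @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(P)

# np.linalg.eig returns the eigenvectors as the columns of eigvecs.
eigvals, eigvecs = np.linalg.eig(A)

# Distinct eigenvalues should force the eigenvector matrix to have
# full rank, i.e. the three eigenvectors are linearly independent.
rank = np.linalg.matrix_rank(eigvecs)
print(sorted(np.round(eigvals).astype(int)))  # the distinct eigenvalues
print(rank)
```

Of course, this only checks one example matrix; the proof above is what guarantees the conclusion for every matrix with distinct eigenvalues.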