Proving linear independence of $e^{a_it}, i=1,...,n$


As the title states, I'm trying to prove that the system $e^{\alpha_1t},e^{\alpha_2t},\dots,e^{\alpha_nt}$ in $C[a,b]$ is linearly independent when the exponents are distinct ($\alpha_i\neq \alpha_j$ for $i\neq j$). So I form the Wronskian of these functions and try to evaluate the determinant. $$ W\left[e^{\alpha_it}\right](t) = \begin{vmatrix} e^{\alpha_1t} & \cdots & e^{\alpha_n t}\\ \vdots & \ddots & \vdots\\ \alpha_1^{n-1}e^{\alpha_1t} & \cdots & \alpha_n^{n-1}e^{\alpha_nt} \end{vmatrix} = \exp\left(t\sum_{i=1}^n\alpha_i\right) \begin{vmatrix} 1 & \cdots & 1\\ \vdots & \ddots & \vdots\\ \alpha_1^{n-1} & \cdots & \alpha_n^{n-1} \end{vmatrix} $$ However, I don't know how to prove that the remaining determinant is nonzero. I have tried mathematical induction, but I can't find a clear relationship between the cases $n=k$ and $n=k+1$. So, how can I prove that the determinant is nonzero?
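Not part of the original question, but the factorization above is easy to sanity-check symbolically for a small $n$ with SymPy (the symbol names `a1, a2, a3` and the choice $n=3$ are mine):

```python
import sympy as sp

t = sp.symbols('t')
a = sp.symbols('a1 a2 a3')  # the exponents alpha_i, here n = 3

# Wronskian matrix: row k holds the k-th derivatives of e^{a_i t}
W = sp.Matrix(3, 3, lambda k, i: sp.diff(sp.exp(a[i] * t), t, k))

det = sp.simplify(W.det())

# Claimed factorization: exp(t * sum(a_i)) times the Vandermonde product
vander = sp.prod([a[j] - a[i] for i in range(3) for j in range(i + 1, 3)])
expected = sp.exp(t * sum(a)) * vander

print(sp.simplify(det - expected))  # 0 if the factorization holds
```

This only checks one small case, of course; it is no substitute for the general proof asked about below.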

BEST ANSWER

You arrived at the transpose of the famous Vandermonde matrix, whose determinant is $$ \det M = \prod_{1\le i < j \le n} (\alpha_j - \alpha_i), $$ which is nonzero iff the $\alpha_i$ are distinct.

A proof is given in the Wikipedia article on the Vandermonde matrix.
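For the induction step the question asked about, one standard route (a sketch of the usual row-reduction argument, not taken verbatim from the answer): in the $n\times n$ matrix $V_n$ with entries $\alpha_j^{k-1}$, replace each row $R_k$ by $R_k-\alpha_1 R_{k-1}$, working from the bottom row up. The first column becomes $(1,0,\dots,0)^T$, and for $k\ge 2$ the entry in row $k$, column $j$ becomes $$ \alpha_j^{k-1}-\alpha_1\alpha_j^{k-2}=\alpha_j^{k-2}(\alpha_j-\alpha_1). $$ Expanding along the first column and pulling the common factor $(\alpha_j-\alpha_1)$ out of each column $j\ge 2$ gives the recursion $$ \det V_n(\alpha_1,\dots,\alpha_n)=\prod_{j=2}^{n}(\alpha_j-\alpha_1)\,\det V_{n-1}(\alpha_2,\dots,\alpha_n), $$ and induction on $n$ then yields $\det V_n=\prod_{1\le i<j\le n}(\alpha_j-\alpha_i)$, which is nonzero exactly when the $\alpha_i$ are distinct.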