Linear independence of continuous functions


I've come up with a proof of the theorem below (an "exercise left to the reader" in Gel'fand's Lectures on Linear Algebra), but it feels a bit flimsy to me.

My questions are:

  1. Is the reasoning here correct? Specifically, is the step in the punchline where we invoke the fundamental theorem of algebra, and conclude from it that our polynomial is not everywhere zero, a reasonable one?

  2. Do we need to invoke the fundamental theorem of algebra here? Is there a simpler proof? I'm specifically avoiding Wronskians and other linear-algebraic machinery not yet developed by Gel'fand.

Theorem

Let $R$ be the space of continuous functions. Let $n$ be any natural number.

Then the functions $A = \{ f_1, f_2, \dots, f_n \}$, where $f_1(t) = 1$, $f_2(t) = t$, ..., $f_n(t) = t^{n-1}$, form a linearly independent set.

Proof

We proceed by contradiction. Suppose $A$ is linearly dependent.

Then we can choose scalars $\eta_1, \eta_2, \dots, \eta_n$, not all $0$, such that

$\eta_1 f_1 + \eta_2 f_2 + ... + \eta_n f_n = 0(t)$

where $0(t)$ is the function that is everywhere zero.

Rewriting in terms of $t$, we have

(I) $\eta_1 + \eta_2 t + ... + \eta_n t^{n - 1} = 0(t)$

Let $n'$ be the largest exponent of $t$ in (I) whose coefficient $\eta_i$ is non-zero; such an $n'$ exists because the $\eta_i$ are not all zero.

If $n' = 0$, the polynomial in (I) is the non-zero constant $\eta_1$, which is certainly not everywhere zero. Otherwise, by the fundamental theorem of algebra, the polynomial in (I), having degree $n' \geq 1$, has exactly $n'$ complex roots counted with multiplicity, and hence at most $n'$ distinct roots.

But this is a contradiction: the left-hand side of (I) is supposed to vanish at every value of $t$, i.e. at infinitely many points, while a non-zero polynomial can vanish at only finitely many.

Hence our original assumption that $A$ is linearly dependent must be false, and so $A$ is linearly independent. $\square$
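As an informal sanity check (not part of the proof), the key fact used above, that a non-zero polynomial of degree $d$ vanishes at no more than $d$ points, can be illustrated numerically. The helper `poly` below is a hypothetical evaluator for $\eta_1 + \eta_2 t + \dots + \eta_n t^{n-1}$; sampling at more than $d$ distinct points must turn up a non-zero value:

```python
def poly(coeffs, t):
    """Evaluate eta_1 + eta_2*t + ... + eta_n*t^(n-1) by Horner's rule.

    coeffs[i] is the coefficient of t^i (i.e. eta_{i+1}).
    """
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * t + c
    return acc

# A degree-2 example: 2 - 3t + t^2 = (t - 1)(t - 2).
coeffs = [2.0, -3.0, 1.0]

# Sample at more distinct points than the degree (here 4 > 2).
samples = [0, 1, 2, 3]
values = [poly(coeffs, t) for t in samples]

print(values)  # → [2.0, 0.0, 0.0, 2.0]: zero at the two roots, non-zero elsewhere
assert any(v != 0 for v in values)
```

Of course a finite check proves nothing about all coefficient choices; it only makes the "at most $d$ roots" bound concrete for one example.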