Linearly independent set of functions


I have the following set of functions $S=\{f_i \mid i\in\mathbb{N}\}$ with $f_i \in\operatorname{Map}(\mathbb{N},\mathbb{Q})$ and $$f_i(n)= \begin{cases} n, & \mbox{if}\quad n\le i; \\ 0, & \mbox{if}\quad n>i. \end{cases}$$
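As a quick sanity check of the definition (not part of any proof), the values $f_i(n)$ can be tabulated for small arguments. The helper name `f` below is my own, and I am assuming the convention $\mathbb N = \{1, 2, 3, \dots\}$:

```python
# Hypothetical helper: tabulate the functions f_i on small arguments.
# Assumed convention: N = {1, 2, 3, ...}.

def f(i, n):
    """Value f_i(n) = n if n <= i, else 0."""
    return n if n <= i else 0

# Rows: i = 1..4; columns: n = 1..4.
table = [[f(i, n) for n in range(1, 5)] for i in range(1, 5)]
# table[0] (i = 1) is [1, 0, 0, 0]; table[3] (i = 4) is [1, 2, 3, 4].
```

Each row extends the previous one by a single new nonzero entry, which is the intuition behind the answers below.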

I must show that the set $S$ is linearly independent.

I have the intuition that I can solve this by induction, but I am not sure how. Can you guys please help me?




Well, $S$ is linearly independent iff, for any finite family of pairwise distinct indices $(i_p)_{p\in I}$ (that is, $i_p \neq i_q$ whenever $p\neq q$), we have $$\sum_{p\in I} \alpha_p f_{i_p} = 0 \quad \Rightarrow \quad \alpha_q = 0 \quad \forall q\in I.\tag1$$

For now, suppose that $I = \{a,b\}$ and call $i_a =: i$, $i_b =: j$. Suppose, in particular, that $i< j$. Then, for every $k\in\mathbb N$ such that $i <k\leq j$, we have $f_i(k) = 0$, while $f_j(k) = k$. Now fix such a $k$. We know by hypothesis that $\alpha_i f_i + \alpha_j f_j$ must be the zero function, i.e. $$ \alpha_i f_i(m)+ \alpha_j f_j(m) = 0, \qquad \forall m \in \mathbb N, \tag2$$ and so in particular for $m=k$, i.e. $$ \alpha_i\cdot 0 + \alpha_j \cdot k= 0, $$ which entails $\alpha_j = 0$, since $k \neq 0$. Therefore equation $(2)$ becomes $$\alpha_i f_i(m) = 0, \qquad \forall m \in \mathbb N, $$ which implies $\alpha_i = 0$ (evaluate at $m = i$, where $f_i(i) = i \neq 0$). Then condition $(1)$ is satisfied for a set of two indices and we are done.
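The two-index elimination above amounts to saying that the $2\times 2$ matrix of sampled values is invertible. A minimal Python sketch, not part of the answer (the helper names `f` and `det2` are mine, and I assume indices start at $1$):

```python
# Hypothetical check of the two-index case: sample the combination
# alpha_i f_i + alpha_j f_j at the points m = i and m = j.  The
# resulting 2x2 matrix of values has nonzero determinant, so only
# the zero coefficients solve the system.

def f(i, n):
    """f_i(n) = n if n <= i, else 0."""
    return n if n <= i else 0

def det2(i, j):
    """Determinant of [[f_i(i), f_j(i)], [f_i(j), f_j(j)]] for i < j."""
    a, b = f(i, i), f(j, i)   # row for m = i: values (i, i)
    c, d = f(i, j), f(j, j)   # row for m = j: values (0, j)
    return a * d - b * c      # = i*j, nonzero whenever 1 <= i < j
```

For example `det2(2, 5)` is $2 \cdot 5 = 10 \neq 0$, matching the argument that $\alpha_j$ and then $\alpha_i$ are forced to vanish.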

But what if $I$ has cardinality greater than $2$? Suppose $I$ contains the elements $a,q,b$ with $i_a < i_q < i_b$. Since $i_q$ lies strictly between $i_a$ and $i_b$, we may find $k_1$ and $k_2$ with $i_a < k_1 \leq i_q$ and $i_q < k_2 \leq i_b$; then we have the system $$\begin{cases} \alpha_a f_{i_a} (k_1) + \alpha_q f_{i_q} (k_1) + \alpha_b f_{i_b}(k_1) = \alpha_a \cdot 0 + \alpha_q \cdot k_1 + \alpha_b \cdot k_1 = 0 \\ \alpha_a f_{i_a} (k_2) + \alpha_q f_{i_q} (k_2) + \alpha_b f_{i_b}(k_2) = \alpha_a \cdot 0 + \alpha_q \cdot 0 + \alpha_b \cdot k_2 = 0 \end{cases}$$ Hence, from the second equation, $\alpha_b = 0$; substituting into the first equation gives $\alpha_q = 0$; and finally the hypothesis $\alpha_a f_{i_a}+ \alpha_q f_{i_q} + \alpha_b f_{i_b} = 0$ reduces to $\alpha_a f_{i_a} = 0$, which at $m = i_a$ gives $\alpha_a \cdot i_a = 0$, i.e. $\alpha_a = 0$. We're done. The same process works for any finite $|I|$: evaluate at points wedged between consecutive indices and eliminate the coefficients from the largest index downwards.
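The generalization to arbitrary finite $|I|$ can be phrased as: sampling the combination at the (sorted) indices themselves produces an upper-triangular matrix with nonzero diagonal, so all coefficients vanish. A Python illustration of that claim (the function names are mine, and I assume $\mathbb N = \{1, 2, \dots\}$):

```python
# Sketch of the general argument: for sorted indices i_1 < ... < i_k,
# the matrix M with M[q][p] = f_{i_p}(i_q) is upper triangular with
# nonzero diagonal entries i_1, ..., i_k, hence invertible.

def f(i, n):
    """f_i(n) = n if n <= i, else 0."""
    return n if n <= i else 0

def value_matrix(indices):
    """M[q][p] = f_{i_p}(i_q) for the sorted indices i_1 < ... < i_k."""
    idx = sorted(indices)
    return [[f(ip, iq) for ip in idx] for iq in idx]

def is_upper_triangular_with_nonzero_diagonal(M):
    """Check the shape that makes the only solution the zero one."""
    k = len(M)
    for q in range(k):
        if M[q][q] == 0:
            return False
        for p in range(q):     # entries strictly below the diagonal
            if M[q][p] != 0:
                return False
    return True
```

For instance, `value_matrix([2, 5, 9])` is `[[2, 2, 2], [0, 5, 5], [0, 0, 9]]`, and back substitution from the last row upwards mirrors the elimination in the answer.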


For starters, you should get a better idea of what we need to prove, so you should check out the definition of linear dependence and independence of a collection of vectors. Proving that an infinite set is linearly independent amounts to proving the following:

For any finite subset of indices $i_1,i_2,\ldots,i_k$, the equality $c_1f_{i_1}+c_2f_{i_2}+\cdots+c_kf_{i_k}=0$ implies that all coefficients are zero, i.e. $c_1=c_2=\cdots=c_k=0$.

And I agree that you indeed can approach it using induction on the number of functions in such a linear combination. The induction base would be $k=1$, i.e. a proof for a single function. (It's going to be ridiculously simple: evaluating $c_1 f_{i_1} = 0$ at $n = i_1$ gives $c_1 i_1 = 0$, hence $c_1 = 0$. But it has to be done!)

For the induction step, let's say you want to prove that the statement holds for $k=m$ assuming it's true for $k=m-1$. The key idea will be as follows: assume $c_1f_{i_1}+c_2f_{i_2}+\cdots+c_mf_{i_m}=0$ (meaning the identically zero function); without loss of generality, we can assume that $i_m$ is the largest of all indices involved (after all, we can always sort them); then, evaluating at $n = i_m$, every other $f_{i_p}$ vanishes (because $i_m > i_p$) while $f_{i_m}(i_m) = i_m \neq 0$, so $c_m = 0$; and now we are left with a linear combination of $m-1$ functions, so the induction hypothesis finishes the proof.
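The induction step described above can be mimicked mechanically: repeatedly evaluate at the largest remaining index, force that coefficient to zero, and discard it. A sketch (the names `f` and `peel_coefficients` are my own, and I assume all indices are $\geq 1$):

```python
# Hypothetical demonstration of the induction step: at n = i_m (the
# largest index), every f_{i_p} with i_p < i_m vanishes, while
# f_{i_m}(i_m) = i_m != 0, so c_m is forced to zero; then repeat
# with one fewer function.

def f(i, n):
    """f_i(n) = n if n <= i, else 0."""
    return n if n <= i else 0

def peel_coefficients(indices):
    """Force each coefficient to zero, largest index first.

    Returns {index: forced coefficient} -- always all zeros."""
    idx = sorted(indices)
    forced = {}
    while idx:
        top = idx.pop()  # largest remaining index
        # Every remaining f_i vanishes at n = top, and f_top(top) != 0,
        # so the equation at n = top reads c_top * top == 0.
        assert all(f(i, top) == 0 for i in idx)
        assert f(top, top) == top != 0
        forced[top] = 0
        # The combination now involves one fewer function: recurse/loop.
    return forced
```

Calling `peel_coefficients([3, 1, 7])` walks the indices in the order $7, 3, 1$, exactly as the sorted induction argument does.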