How to show $\{u_1,\sum_{i=1}^{2}u_i,\cdots,\sum_{i=1}^{n}u_i\}$ is linearly independent if $\{u_1,u_2,\cdots,u_n\}$ is?


Suppose $\{u_1,u_2,\cdots,u_n\}$ is linearly independent; we need to show that $\{u_1,\sum_{i=1}^{2}u_i,\cdots,\sum_{i=1}^{n}u_i\}$ is linearly independent. To that end, suppose

$$ \alpha_1u_1+\alpha_2\sum_{i=1}^{2}u_i+\cdots+\alpha_n\sum_{i=1}^{n}u_i=0 $$ for some $\alpha_1,\alpha_2,\cdots,\alpha_n \in \mathbb{R}$.

We need to show $\alpha_1=\alpha_2=\cdots=\alpha_n=0$.

I know we have to use the linear independence of $\{u_1,u_2,\cdots,u_n\}$, but I don't know how.

Extension: How can we prove the converse?

4 Answers

BEST ANSWER

Suppose $(\lambda_1,...,\lambda_n)$ is a set of coefficients such that $$ 0 = \sum_{i=1}^n \lambda_iv_i \tag{1} $$ where $v_i = \sum_{k=1}^i u_k$. We want to show that then one must have $ \lambda_i=0$ for all $1\leq i\leq n$.

We can rewrite this as $$ 0 = \sum_{i=1}^n \lambda_i \sum_{k=1}^i u_k= \sum_{k=1}^n \sum_{i=k}^n \lambda_i u_k = \sum_{k=1}^n \left(\sum_{i=k}^n \lambda_i\right) u_k \tag{2} $$ and so, by independence of the $u_k$'s, we must have $$ \sum_{i=k}^n \lambda_i = 0 \qquad \forall 1\leq k\leq n \tag{3} $$ Show that this implies $\lambda_i=0$ for all $1\leq i\leq n$ (e.g., by induction, taking $k$ from $n$ down to $1$ in (3): the case $k=n$ gives $\lambda_n=0$, and subtracting consecutive equations gives $\lambda_k=0$).
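As a sanity check (not part of the proof), equation (2) can be verified numerically; this sketch assumes NumPy and uses the fact that forming the prefix sums $v_i$ is exactly right-multiplication of the column matrix of the $u_i$'s by an upper triangular matrix of ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Columns of U are u_1, ..., u_n; a random Gaussian matrix is
# full rank almost surely, i.e. the u_i are independent.
U = rng.standard_normal((n, n))
assert np.linalg.matrix_rank(U) == n

# Columns of V are the prefix sums v_i = u_1 + ... + u_i.
V = np.cumsum(U, axis=1)

# Equation (2) in matrix form: V = U @ T, where T is upper
# triangular with ones on and above the diagonal.
T = np.triu(np.ones((n, n)))
assert np.allclose(V, U @ T)

# The v_i are independent iff V has full rank.
print(np.linalg.matrix_rank(V))  # 5
```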


To prove the converse: assume, with $v_i = \sum_{k=1}^i u_k$ as before, that $(v_1,\dots,v_n)$ is linearly independent. Let $\alpha_1,\dots,\alpha_n$ be such that $$ 0 = \sum_{i=1}^n \alpha_i u_i $$ Noting that $u_1 = v_1$ and $u_i = v_i - v_{i-1}$ (for $1< i\leq n$), we get $$ 0 = \alpha_1 v_1 + \sum_{i=2}^{n} \alpha_i v_i - \sum_{i=2}^{n} \alpha_i v_{i-1} = \alpha_1 v_1 + \sum_{i=2}^{n} \alpha_i v_i - \sum_{i=1}^{n-1} \alpha_{i+1} v_{i} = \alpha_n v_n + \sum_{i=1}^{n-1} (\alpha_i - \alpha_{i+1}) v_{i} $$ By our assumption of linear independence of the $v_i$'s, we have $0 = \alpha_n = \alpha_i - \alpha_{i+1}$ (for all $1\leq i<n$). By descending induction, this implies that $\alpha_i = 0$ for all $i$. That shows that $(u_1,\dots,u_n)$ is linearly independent.
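The telescoping identity behind the converse can also be checked numerically. This NumPy sketch recovers the $u_i$ from the prefix sums $v_i = \sum_{k=1}^i u_k$ of the first part by column differencing:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
U = rng.standard_normal((n, n))   # columns u_1, ..., u_n
V = np.cumsum(U, axis=1)          # columns v_i = u_1 + ... + u_i

# Telescoping: u_1 = v_1 and u_i = v_i - v_{i-1} for i >= 2,
# which is exactly np.diff with a zero column prepended.
U_recovered = np.diff(V, axis=1, prepend=np.zeros((n, 1)))
assert np.allclose(U_recovered, U)

# Since each family is obtained from the other by an invertible
# operation, their ranks (hence independence) agree.
assert np.linalg.matrix_rank(U) == np.linalg.matrix_rank(V)
```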

ANSWER

Here is how to continue: if you write out the terms in the sum, you can factor out the $u_i$'s to obtain

$$ ( \alpha_1 + \alpha_2 + ... + \alpha_n ) u_1 + (\alpha_2 + ... + \alpha_n) u_2 + ... + \alpha_n u_n = 0$$

Now use that the $u_i$'s are linearly independent to obtain an upper triangular system, which gives $\alpha_i = 0$ for all $i$ by back substitution, and the result follows.
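The back substitution can be sketched in a few lines of Python: row $k$ of the triangular system says $\alpha_k + \alpha_{k+1} + \cdots + \alpha_n = 0$, so walking from the bottom row upward determines each coefficient in turn (and each turns out to be $0$):

```python
# Solve the triangular system alpha_k + alpha_{k+1} + ... + alpha_n = 0
# (one equation per k) by back substitution, from the last row upward.
def back_substitute(n):
    alpha = [0] * n
    tail = 0                        # known value of alpha_{k+1} + ... + alpha_n
    for k in range(n - 1, -1, -1):  # 0-based: k = n-1 down to 0
        alpha[k] = -tail            # row k forces alpha_k = -tail
        tail += alpha[k]
    return alpha

print(back_substitute(4))  # [0, 0, 0, 0]
```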

ANSWER

The matrix that expresses the new vectors in terms of the original ones has determinant $1$, since it is upper triangular with $1$s on and above the diagonal. So you get a basis, since you are applying an invertible matrix to one. For example, when $n=3$, your matrix looks as follows:

$$\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}.$$

ANSWER

A set of vectors $S$ is linearly independent iff the space $\langle{S}\rangle$ it generates has dimension $\#S$. But $V = \langle{v_1, \dots, v_n}\rangle$ and $V' = \langle{v_1, v_1 + v_2, v_1 + v_2 + v_3, \dots}\rangle$ coincide: clearly $V'\subset V$, and $$v_i = (v_1 + \cdots + v_i) - (v_1 + \cdots + v_{i-1}),$$ so $V \subset V'$ as well.

Alternatively, the vectors $v'_i = v_1 + \cdots + v_i$ have $v'_i = gv_i$ for \begin{align*} g = \begin{pmatrix} 1 & 1 & \cdots & 1 \\ 0 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix} \end{align*} Clearly $g$ is invertible (e.g., $\det g = 1$), so $\dim V' = \dim g(V) = \dim V$.
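As a numerical aside (a NumPy sketch, not part of the argument), one can confirm that $g$ has determinant $1$ for several sizes, and that its inverse is the differencing matrix implementing $v_i = v'_i - v'_{i-1}$:

```python
import numpy as np

for n in range(1, 8):
    # g[j, i] = 1 iff j <= i, so column i of g is e_1 + ... + e_i.
    g = np.triu(np.ones((n, n)))
    assert np.isclose(np.linalg.det(g), 1.0)
    # The inverse encodes the telescoping v_i = v'_i - v'_{i-1}:
    # ones on the diagonal, -1 on the superdiagonal.
    g_inv = np.eye(n) - np.eye(n, k=1)
    assert np.allclose(g @ g_inv, np.eye(n))
print("ok")
```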