Dealing with proofs in linear algebra that seem trivial


I have real trouble approaching proofs in linear algebra in general, but especially when a proof seems so trivial that I don't actually know where to start! For example:

Show that if $\{v_1, \dots, v_n\}$ is a basis of $V$, then every vector $v \in V$ can be uniquely written as $$v = \alpha_1 v_1 + \cdots + \alpha_n v_n$$ for some $\alpha_1, \dots, \alpha_n \in F$.

Here this just seems like trivial knowledge to me. I know that if a set of vectors is a basis for a space, it spans that space and is linearly independent; that is just the definition of a basis. If I try to structure this into a proof, I just end up repeating the question! Is this all I need to say? Or am I missing something more basic here?

Thanks.


There are 3 answers below.

BEST ANSWER

In questions like this, if you're still new to doing mathematics in general, I think it's important to be really systematic:

In your case, $\{v_i\}_{1\leq i\leq n}$ is a basis for $V$. This means that (a) $\{v_i\}_{1\leq i\leq n}$ is a spanning set, and (b) $\{v_i\}_{1\leq i\leq n}$ is linearly independent.

So... given some $v\in V$, we need to show that it has a unique representation as a linear combination of the $v_i$'s. First, we need to show that it has such a representation at all. This follows from the fact that $\{v_i\}_{1\leq i\leq n}$ is spanning - this is literally just the definition of a spanning set, so here it is alright to say that it's obvious.

Next, we need to show uniqueness, and as you indicated, this should follow from linear independence. However, in order to do this, we need to reduce the question to a question of linear independence.

So, assume that $v=\sum_{i=1}^n \alpha_iv_i=\sum_{i=1}^n \beta_i v_i$. Then we see that $0=\sum_{i=1}^n (\alpha_i-\beta_i)v_i$, and by definition of linear independence, this means that $\alpha_i-\beta_i=0$, i.e. $\alpha_i=\beta_i$, for all $i$. This gives the desired uniqueness.
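As a concrete sanity check (an illustration only, not part of the proof): in $\mathbb{R}^n$, writing the basis vectors as the columns of a matrix $B$, the coefficients $\alpha$ of a vector $v$ are the solution of $B\alpha = v$, and that solution is unique precisely because $B$ is invertible. A minimal sketch, with a basis of $\mathbb{R}^3$ chosen for illustration:

```python
import numpy as np

# Columns of B form a basis of R^3 (they are linearly independent,
# so B is invertible).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 4.0])

# The coordinates of v in this basis: the unique solution of B @ alpha = v.
alpha = np.linalg.solve(B, v)

# v is recovered as the linear combination alpha_1*v_1 + ... + alpha_n*v_n.
assert np.allclose(B @ alpha, v)
print(alpha)
```

Uniqueness shows up here as the fact that `np.linalg.solve` succeeds and returns exactly one coefficient vector; for a singular (linearly dependent) column set it would raise an error instead.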

The point of this type of exercise is typically to practice proof techniques, and the important thing, in my experience, is just to attempt to be completely systematic and never claim that something is just obvious (even though it may very well be).

ANOTHER ANSWER

This is certainly "close to trivial" in some sense; in fact, the proof of the on-the-face-of-it stronger result (that if $I$ is linearly independent, then every $w\in\operatorname{span}(I)$ can be written in exactly one way as a linear combination of elements of $I$) is one line long:

Supposing $\sum\alpha_iv_i=\sum\beta_iv_i$, we have $\sum(\alpha_i-\beta_i)v_i=0$; since $\{v_1,...,v_n\}$ is linearly independent, we have $\alpha_i-\beta_i=0$ for all $i$, that is $\alpha_i=\beta_i$ for all $i$.

However, I'd argue that it is not in fact trivial. "$B$ is a basis" means the following:

  • Every vector can be written as at least one linear combination of elements of $B$. ("$B$ spans $V$.")

  • The specific vector $0$ cannot be written as a nontrivial linear combination of elements of $B$ - or put another way, there is only one way to write the specific vector $0$ as a linear combination of elements of $B$, namely the trivial linear combination. ("$B$ is linearly independent.")

That second condition is the really relevant one here, but just in terms of phrasing it falls short of saying "No vector can be written in two ways as a linear combination of elements of $B$" - it's phrased just for $0$. The jump from this to the "all-vectors" version is very short, but is not quite trivial.

ANOTHER ANSWER

The converse is also true, and not so trivial.

The converse states: if every vector $v \in V$ can be written uniquely as a linear combination of the $n$ vectors $v_1,\dots,v_n$ in $V$, then $v_1,\dots,v_n$ form a basis for $V$.

To show that $v_1,\dots,v_n$ form a basis, we show that (i) they span $V$ and (ii) they are linearly independent.

That they form a spanning set is given, since every vector $v \in V$ can be written (uniquely) as a linear combination of the $n$ vectors.

To show that they are linearly independent, suppose (for contradiction) that one vector $v_i$ is a linear combination of the others; equivalently, $\sum a_i v_i = 0$ with the $a_i$ not all $0$. But $0 = \sum 0 \cdot v_i$ as well, so uniqueness of the expression for $0$ forces $a_i = 0$ for all $i$, a contradiction. Hence no $v_i$ is a linear combination of the others, and $v_1,\dots,v_n$ are linearly independent, as required.
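The contrapositive of this argument can also be seen numerically: when the $v_i$ are linearly dependent, representations stop being unique, because any null-space combination can be added to the coefficients. A small sketch (the vectors and coefficients below are chosen for illustration):

```python
import numpy as np

# Columns of A are linearly DEPENDENT: the third column is the sum
# of the first two, so v1 + v2 - v3 = 0 is a nontrivial relation.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
v = np.array([3.0, 5.0])

a = np.array([3.0, 5.0, 0.0])   # one representation: 3*v1 + 5*v2
n = np.array([1.0, 1.0, -1.0])  # the relation v1 + v2 - v3 = 0
b = a + 2.0 * n                 # a genuinely different coefficient vector

assert np.allclose(A @ a, v)
assert np.allclose(A @ b, v)    # same vector v, different coefficients
```

So uniqueness of representations really does encode linear independence, exactly as the proof above argues.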

You may say obvious. But all true statements are obvious!