Purpose of a vector space


I am currently studying linear algebra and I've seen what vector spaces are, but I can't seem to understand what their purpose is.

What do they allow us to do?

There are 3 answers below.

BEST ANSWER

Vector spaces provide, among other things, an excellent framework for analyzing systems of linear equations, e.g. $$2x+3y+4z = 5 \\ 3x+4y+5z = 6 \\ x+y =2 $$

Solving a system of $m$ equations that are linear in $n$ variables is equivalent to solving the matrix/vector equation $$Ax = b,$$ where $A \in M_{m,n}(\mathbb F)$, the set of $m\times n$ matrices with entries from some field $\mathbb F$ (usually $\mathbb R$ or $\mathbb C$), $x \in \mathbb F^n$ is the column vector of your variables, and $b \in \mathbb F^m$ is the column vector of the constants on the right-hand sides of your equations.

In this case, solving the system of linear equations is equivalent to row-reducing the augmented matrix $[A|b]$. The reduced form tells you a great deal about the system at a glance: you can see whether (a) the system is inconsistent (i.e. has no solutions), (b) the system has a unique solution (in which case you can read off the values of the variables), or (c) the system has infinitely many solutions.
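To see this concretely, here is a short sketch (my own illustration, not part of the original answer) that row-reduces the augmented matrix for the example system above; the choice of SymPy and the variable names are my assumptions:

```python
import sympy as sp

# Coefficient matrix A and right-hand side b for the example system:
#   2x + 3y + 4z = 5
#   3x + 4y + 5z = 6
#    x +  y      = 2
A = sp.Matrix([[2, 3, 4],
               [3, 4, 5],
               [1, 1, 0]])
b = sp.Matrix([5, 6, 2])

augmented = A.row_join(b)            # form the augmented matrix [A|b]
rref_form, pivots = augmented.rref() # reduced row echelon form

print(rref_form)  # identity block plus solution column: x = -3, y = 5, z = -1
print(pivots)     # (0, 1, 2): a pivot in every variable column, so the solution is unique
```

Here case (b) applies: every variable column has a pivot, so the last column of the reduced matrix is exactly the unique solution.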

As others have stated, vector spaces are an abstract concept: results are proved in general and then applied to many specific examples. Solving systems of linear equations is merely one of the first places where you encounter mathematical objects that are more easily viewed from the perspective of a vector space.

(Edited: I really should have emphasized the linear nature of what we're doing here; that's why it's called linear algebra.)

ANSWER

One can show, for example, that the set of all solutions to a homogeneous linear $n$th-order differential equation forms an $n$-dimensional vector space. Similarly, the set of solutions to a homogeneous linear partial differential equation forms an infinite-dimensional vector space. Basically, vector spaces encapsulate the idea of linearity.
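As a small numeric illustration (my addition; the equation $y''+y=0$ is just a convenient example), any linear combination of the basis solutions $\cos t$ and $\sin t$ is again a solution:

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 9)
a, c = 3.0, -2.0                           # arbitrary scalars

# Linear combination of the two basis solutions of y'' + y = 0.
y = a * np.cos(t) + c * np.sin(t)

# Its second derivative, computed by hand from the basis functions.
y_second = -a * np.cos(t) - c * np.sin(t)

print(np.allclose(y_second + y, 0))        # True: y'' + y = 0 still holds
```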

ANSWER

The reason to study any abstract structure (vector spaces, groups, rings, fields, etc) is so that you can prove things about every single set with that structure simultaneously.

Vector spaces are just sets of "objects" where we can talk about "adding" the objects together and "multiplying" the objects by numbers. Any set where we can do this (and that follows all of the rules that addition and scalar multiplication have to obey) is a vector space. There are tons of different examples of vector spaces, and when we prove things about vector spaces in general, we're proving things about all of those specific vector spaces at once.

So for instance, you'll learn about the Cauchy-Schwarz inequality for vector spaces (technically inner product spaces, but those are just a particular type of vector space) in linear algebra. It says that $$|v\cdot w| \le \|v\|\|w\|$$ where $v,w$ are vectors in our vector space and $\|v\|=(v\cdot v)^{1/2}$ is the norm of the vector $v$.
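To make this concrete, here is a quick numerical check (my own sketch, not from the answer) that the inequality holds for randomly generated vectors in $\mathbb R^5$; the small tolerance guards against floating-point round-off:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    v = rng.standard_normal(5)
    w = rng.standard_normal(5)
    # |v . w| <= ||v|| ||w||, up to floating-point error
    assert abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w) + 1e-12

print("Cauchy-Schwarz held for all samples")
```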

Then at some later time you'll discover that the set of all continuous functions defined on the interval $[a,b]$ is a vector space (because we can define addition and scalar multiplication of continuous functions), and that $f\cdot g = \int_a^b f(x)g(x)\,dx$ is an inner product -- the same type of thing as the dot product on $\Bbb R^n$. Knowing this, we immediately get the result that $$\left| \int_a^b f(x)g(x)\,dx\right| \le \left(\int_a^b (f(x))^2\, dx\right)^{1/2}\left(\int_a^b (g(x))^2\, dx\right)^{1/2}$$ for free -- we don't even have to prove it. This seemingly difficult (and interesting) inequality about integrals comes for free because we proved that the Cauchy-Schwarz inequality holds for every inner product space, including this one.
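And here is the integral version checked numerically (again my own sketch; the functions $e^x$ and $\sin x$ on $[0,1]$ are arbitrary choices, and I'm assuming SciPy's quad for the quadrature):

```python
import numpy as np
from scipy.integrate import quad

a, b = 0.0, 1.0      # the interval [a, b]
f = np.exp           # two arbitrary continuous functions on [a, b]
g = np.sin

# |<f, g>| for the integral inner product
lhs = abs(quad(lambda x: f(x) * g(x), a, b)[0])

# ||f|| * ||g||, where ||f|| = (<f, f>)^(1/2)
rhs = (quad(lambda x: f(x) ** 2, a, b)[0] ** 0.5 *
       quad(lambda x: g(x) ** 2, a, b)[0] ** 0.5)

print(lhs <= rhs)    # True: |<f, g>| <= ||f|| ||g||
```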