I've been doing some exercises about inner products and I found something interesting, but I don't know if my approach is correct at all.
Suppose that $\{v_{1}, v_{2}, \dots, v_{n}\}$ is a basis for a vector space $V$ over $\mathbb{R}$, equipped with a real inner product $\langle \cdot , \cdot \rangle$. Then for any scalars $r_{1}, r_{2}, \dots, r_{n} \in \mathbb{R}$ there exists a unique $w \in V$ such that $\langle v_{i}, w \rangle = r_{i}$ for every $i = 1, 2, \dots, n$.
I started by proving uniqueness: suppose there exist $u, w \in V$ such that $\langle v_{i}, u \rangle = \langle v_{i}, w \rangle = r_{i}$ for every $i = 1, 2, \dots, n$. Since $\{v_{1}, v_{2}, \dots, v_{n}\}$ is a basis for $V$, we can write:
$u = a_{1}v_{1} + a_{2}v_{2} + ... + a_{n}v_{n}$
$w = b_{1}v_{1} + b_{2}v_{2} + ... + b_{n}v_{n}$
for unique $a_{1}, a_{2}, \dots, a_{n}, b_{1}, b_{2}, \dots, b_{n} \in \mathbb{R}$.
By linearity, $\langle v_{i}, u - w \rangle = 0$ for $i = 1, 2, \dots, n$. Since $u - w$ is itself a linear combination of the $v_{i}$, bilinearity gives $\langle u - w, u - w \rangle = 0$, and positive definiteness then forces $u - w = 0$, i.e. $u = w$.
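Spelled out, the key uniqueness step reads as follows: writing $u - w = \sum_{j} c_{j} v_{j}$, bilinearity gives

```latex
\langle u - w,\, u - w \rangle
  = \Big\langle \sum_{j=1}^{n} c_j v_j,\; u - w \Big\rangle
  = \sum_{j=1}^{n} c_j \,\langle v_j,\, u - w \rangle
  = 0,
```

and $\langle x, x \rangle = 0$ only for $x = 0$, so $u = w$.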
Now we have to prove existence. We can restate the hypothesis this way: let $A = (a_{i,j})$ be the $n \times n$ matrix with $a_{i,j} = \langle v_{i}, v_{j} \rangle$ (the Gram matrix of the basis), and let $r = \begin{bmatrix} r_{1} \\ r_{2} \\ \vdots \\ r_{n} \end{bmatrix}$
Consider the homogeneous system $Ax = 0$. If $Ax = 0$, then $w' = x_{1}v_{1} + \dots + x_{n}v_{n}$ satisfies $\langle v_{i}, w' \rangle = 0$ for every $i$; by the uniqueness argument above, $w' = 0$, and since the $v_{i}$ form a basis, $x = 0$. Therefore $x = 0$ is the only solution, so the columns of $A$ are linearly independent, and since $A$ has $n$ columns, they span $\mathbb{R}^n$.
So we can write $r$ as a linear combination of the columns of $A$; that is, $Ax = r$ has a solution $x$. Writing out the equations explicitly and using the bilinearity of $\langle \cdot , \cdot \rangle$, the components of $x$ are exactly the coordinates of the $w$ we wanted to find, namely $w = x_{1}v_{1} + \dots + x_{n}v_{n}$.
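As a quick numerical sketch of this construction (not part of the proof — the basis, the standard inner product on $\mathbb{R}^3$, and the values $r_i$ below are arbitrary choices for illustration), one can build the Gram matrix $A$ with `numpy`, solve $Ax = r$, and check that the resulting $w$ reproduces the prescribed inner products:

```python
import numpy as np

# A (non-orthogonal) basis of R^3, one basis vector per row; any basis works.
V = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])

# Gram matrix: A[i, j] = <v_i, v_j> under the standard inner product.
A = V @ V.T

# Prescribed values r_i = <v_i, w>.
r = np.array([2.0, -1.0, 3.0])

# Solve A x = r; the Gram matrix of a basis is invertible.
x = np.linalg.solve(A, r)

# Reconstruct w = x_1 v_1 + ... + x_n v_n.
w = V.T @ x

# Check: <v_i, w> = r_i for every i.
print(np.allclose(V @ w, r))  # True
```

Since $V w = V V^{T} x = A x = r$, the check succeeds by construction.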
My question is: if we suppose that the solution of $Ax = b$ is unique whenever one exists, and $A$ is an $n \times n$ matrix, then $Ax = b$ has a solution for every $b \in \mathbb{R}^n$. Is that correct?
That's correct. If $Ax = b$ has at most one solution for each $b$, then the columns of $A$ are linearly independent. If they were linearly dependent, there would be a nontrivial linear combination of the columns equal to zero; then for any $b$ for which $Ax = b$ has a solution, adding that nontrivial combination would produce a second, different solution, contradicting uniqueness. (Moreover, $n$ dependent columns cannot span all of $\mathbb{R}^n$, so for some $b$ there would be no solution at all.)
So the columns of $A$ are linearly independent, and since there are $n$ of them in $\mathbb{R}^n$, they span the entire $\mathbb{R}^n$: every element of $\mathbb{R}^n$ is represented uniquely as a linear combination of the columns, which gives the existence you wanted.
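A small numerical illustration of this equivalence (the matrix and right-hand side below are arbitrary choices, not taken from the question): for a square matrix with linearly independent columns, $Ax = b$ is solvable for every $b$:

```python
import numpy as np

# An arbitrary 3x3 matrix with linearly independent columns
# (equivalently: Ax = 0 has only the trivial solution; det(A) = 7 != 0).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])

# Full column rank <=> rank n <=> invertible, for a square matrix.
print(np.linalg.matrix_rank(A))  # 3

# Hence Ax = b has a (unique) solution for any b.
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True
```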