I have this exercise. I am able to solve it, but the problem is that I can solve it without using the last piece of information, the existence of the $u$-vector. That makes me afraid that my proof is false, because it would be a much more general statement. Do you see where my error is?

I prove it is complete by proving that every Cauchy sequence converges. So assume that $\{\textbf{x}_i\}$ is a Cauchy sequence. Since we have a basis, every $\textbf{x}_i=\sum_{k=1}^\infty a_k^i v_k$.
The proof will take two steps.
Step one: prove that for a given $K$, the sequence $\{a_K^i\}_i$ is Cauchy, and since $\mathbb{R}$ is complete, it converges to some $a_K$.
Step two: now we have a candidate vector $\sum_{k=1}^\infty a_k v_k$, and I prove that our original Cauchy sequence converges to this vector.
Step 1: Start with the Cauchy sequence $\{\textbf{x}_i\}$. Since it is Cauchy, we can make this quantity as small as we want: $\langle\sum_{k=1}^\infty a_k^i v_k-\sum_{k=1}^\infty a_k^j v_k,\sum_{k=1}^\infty a_k^i v_k-\sum_{k=1}^\infty a_k^j v_k\rangle^{1/2}=(\sum_{k=1}^\infty(a_k^i-a_k^j)^2)^{1/2}$. Now it is easy to see that for a given $K$, $\{a_K^i\}_i$ is a Cauchy sequence in $\mathbb{R}$. To get this I used that if two sequences converge, then their difference also converges. And I used continuity of the inner product, so I can work with finite sums and then let the limit go to infinity. When I worked with the finite sums I also used the orthonormality.
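Spelling out the bound this step relies on (a sketch of the Cauchy–Schwarz argument, using orthonormality of the $v_k$ and continuity of the inner product to write $a_K^i=\langle\textbf{x}_i,v_K\rangle$):

```latex
\left|a_K^i - a_K^j\right|
  = \left|\langle \textbf{x}_i - \textbf{x}_j,\, v_K \rangle\right|
  \le \|\textbf{x}_i - \textbf{x}_j\|\,\|v_K\|
  = \|\textbf{x}_i - \textbf{x}_j\|,
```

so for each fixed $K$ the coefficient sequence $\{a_K^i\}_i$ inherits the Cauchy property directly from $\{\textbf{x}_i\}$.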
Anyway, since we now have that the $a$'s converge, we can go to step 2.
Step 2: We now have the candidate vector $\sum_{k=1}^\infty a_k v_k$, and we show that $\sum_{k=1}^\infty a_k^i v_k$ converges to this vector as $i$ goes to infinity.
$\langle\sum_{k=1}^\infty a_k^i v_k-\sum_{k=1}^\infty a_k v_k,\sum_{k=1}^\infty a_k^i v_k-\sum_{k=1}^\infty a_k v_k\rangle^{1/2}=(\sum_{k=1}^\infty(a_k^i-a_k)^2)^{1/2}$. To see that the last expression goes to zero, observe that for any $M$, $(\sum_{k=1}^M(a_k^i-a_k)^2)^{1/2}$ goes to zero, because every term goes to zero; since this holds for every $M$, it also holds in the limit.
So, where is the mistake? If there is no mistake, that must mean that every inner product space with an orthonormal basis is complete. Is that true?
EDIT UPDATE: Is it possible that $\sum_{k=1}^\infty a_kv_k$ may not even be in the vector space? I thought that every linear combination of the basis vectors was in the vector space, but maybe this holds only for finite linear combinations, and not for infinite series. Is this where the extra condition comes into play? And if we do not use the condition, is there no way to prove that the candidate vector is indeed in the vector space?
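To make this concrete, here is a small Python sketch of exactly that failure. It uses the space of *finitely supported* sequences as a hypothetical stand-in (this is an illustrative example, not necessarily the space in the exercise); the standard unit sequences $e_k$ play the role of the $v_k$:

```python
import math

# Hypothetical example: in the space of finitely supported sequences
# (every vector has only finitely many nonzero coordinates), take the
# Cauchy sequence x_i = sum_{k<=i} (1/k) e_k.

def coeffs(i):
    """Coefficients of x_i: a_k = 1/k for k <= i (zero beyond i)."""
    return [1.0 / k for k in range(1, i + 1)]

def dist(i, j):
    """Norm of x_i - x_j for i <= j: sqrt(sum_{k=i+1}^{j} 1/k^2)."""
    return math.sqrt(sum(1.0 / k**2 for k in range(i + 1, j + 1)))

# The sequence is Cauchy: the tail of sum 1/k^2 is arbitrarily small.
print(dist(100, 10**5))   # roughly 0.1, and shrinking as i, j grow
# But the coordinatewise limit (1, 1/2, 1/3, ...) has infinitely many
# nonzero entries, so it does NOT lie in this space: the candidate
# vector escapes the vector space, even though each x_i is in it.
```

So completeness fails here precisely because the candidate limit is not an element of the space, which is the kind of gap the extra condition would have to close.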
Your mistake is in Step $2$. Whilst it is true that for any $M$, $(\sum_{k=1}^M(a_k^i-a_k)^2)^{1/2}$ goes to $0$ as $i$ goes to infinity, you cannot infer that the infinite sum tends to zero as $i$ tends to infinity.*
I think the flaw lies in assuming that for a sequence $a_{j,n}$ in two variables, $\lim_{j \rightarrow \infty} a_{j,n} = 0$ for each $n$ implies that $\lim_{j\rightarrow \infty} \lim_{n \rightarrow \infty} a_{j,n} = 0$, which is not true in general (it is true in some cases though, if there is some kind of uniform convergence). To see why this is not true, take the sequence $a_{j,n} = 1$ if $n > j$ and $0$ otherwise.
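The counterexample above can be checked mechanically; a minimal sketch (large fixed values stand in for the limits):

```python
# Counterexample: a(j, n) = 1 if n > j, else 0.
def a(j, n):
    return 1 if n > j else 0

# Fix n: once j >= n we have a(j, n) = 0, so lim_{j->inf} a(j, n) = 0
# for every n (here j = 10**6 stands in for "j large").
inner_limits = [a(10**6, n) for n in range(1, 50)]

# Fix j: for n > j we have a(j, n) = 1, so lim_{n->inf} a(j, n) = 1
# for every j, and hence lim_{j->inf} lim_{n->inf} a(j, n) = 1, not 0.
outer_limit_values = [a(j, 10**6) for j in range(1, 50)]
```

So the two iterated limits genuinely disagree, which is why the interchange in Step 2 needs some uniform control to be valid.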
*If you take the limit in $M$ after taking the limit in $i$, it looks like we get what we want. However, you need to take the limit in $M$ before taking the limit in $i$ to get what you want from this argument, and that interchange of limits isn't justified here.