Given any real basis $\{N_1(x), \ldots, N_n(x)\}$ for the polynomials of degree at most $n-1$, and the inner product $\langle f,g\rangle = \int^1_0 f(t)g(t) \, dt$, show that the matrix with entries $\langle N_i,N_j\rangle$ is positive definite.
What I have done is to decompose $N$ into the outer product $AA^\top$, where $A$ is the vector consisting of the basis functions, that is $A=\begin{pmatrix}N_1\\\vdots \\N_n\end{pmatrix}$.
Then I showed $x^\top AA^\top x > 0$ by using the norm.
However, I read somewhere that all matrices formed as outer products have rank $1$, and this would contradict $N$ being positive definite, since a rank-$1$ matrix is singular for $n > 1$.
Does this mean that the outer product does not equal the matrix, or is there another explanation to this apparent contradiction?
Thanks!
If $g$ is an inner product and $e_1,\ldots, e_n$ a basis, then the matrix $N_{ij}:=g(e_i,e_j)$ is positive definite, because for a nonzero vector $v=\sum_j x_j e_j$ represented via a row vector $x=(x_1,\ldots, x_n)$, you have $$ x N x^\top=\sum_{i,j} x_i g(e_i,e_j) x_j=\sum_{i} x_i g\Big(e_i, \sum_j x_j e_j\Big)=\sum_i x_i g(e_i, v)=g\Big(\sum_i x_i e_i, v\Big)=g(v,v)>0. $$
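As a quick numerical sanity check of this (not a proof), one can take the monomial basis $N_i(x) = x^{i-1}$ on $[0,1]$, for which $\langle N_i, N_j\rangle = \int_0^1 x^{i+j-2}\,dx = \frac{1}{i+j-1}$, i.e. the Gram matrix is the Hilbert matrix; its eigenvalues should all be strictly positive and its rank full, not $1$:

```python
import numpy as np

n = 5
# Gram matrix of the monomials N_i(x) = x^(i-1) on [0, 1]:
# <N_i, N_j> = 1/(i+j-1), the n x n Hilbert matrix
N = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

# Positive definite: all eigenvalues strictly positive
eigenvalues = np.linalg.eigvalsh(N)
print(np.all(eigenvalues > 0))  # True

# Full rank n, not rank 1, so it is not the outer product of two vectors
print(np.linalg.matrix_rank(N))  # 5
```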
Formally, the product of your $A$ (as a column vector) and $A^\top$ doesn't make sense; you seem to be mixing matrix multiplication with the inner product. Note that your "matrix" $A$ doesn't consist of numbers but of functions, so the entries of $AA^\top$ would be products of functions, and you still need to apply the inner product entrywise to get a matrix of numbers.
Decomposing such an $N$ into a product of real matrices $A A^\top$ is actually possible only if you already know that $N$ is symmetric and positive (semi-)definite. Just a remark.
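To illustrate that remark numerically (a sketch, using the Hilbert matrix as a concrete positive definite $N$): the Cholesky factorization produces exactly such a decomposition $N = L L^\top$, and it fails for a symmetric matrix that is not positive definite:

```python
import numpy as np

# A symmetric positive definite N: the 4x4 Hilbert matrix
n = 4
N = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

# Cholesky gives a real lower-triangular L with N = L L^T
L = np.linalg.cholesky(N)
print(np.allclose(L @ L.T, N))  # True

# A symmetric but indefinite matrix (eigenvalues 3 and -1) has no such
# real factorization, and Cholesky raises an error:
M = np.array([[1.0, 2.0], [2.0, 1.0]])
try:
    np.linalg.cholesky(M)
except np.linalg.LinAlgError:
    print("not positive definite")
```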
Not sure what you mean by "all matrices that are made from outer products have rank 1": the outer product $uv^\top$ of two single vectors does have rank at most $1$, but your Gram matrix $N$ is not of that form.