This problem comes from my computational methods lab. The task was to write a program which does the following:
Original problem statement:
We have the segment $[0, 1]$. Divide it into $2^n$ equal parts, and define the set of division points as $\{x \mid x = k2^{-n},\ 0 \le k \le 2^n,\ k \in \mathbb{Z}\}$. On each small segment (of the subdivision) we construct two "form functions": each is linear, takes the value 0 at one endpoint of the segment and 1 at the other, and is 0 outside the segment. Then for each division point we construct a basic function, $2^n + 1$ of them in total: the sum of all form functions that take the value 1 at that point. So if the point is neither 0 nor 1, the basic function "looks like" an isosceles triangle; for 0 and 1 it looks like half of one.
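To make the setup concrete, here is a minimal sketch of such a "hat" basic function in Python (the name `hat` and the closure-based interface are my own, not from the assignment):

```python
import numpy as np

def hat(i, n):
    """Return the i-th hat (basic) function on [0, 1] with 2**n elements.

    Nodes are x_k = k * 2**(-n), k = 0..2**n.  The hat g_i equals 1 at x_i,
    0 at every other node, and is linear in between; for i = 0 or i = 2**n
    it is the "half triangle" supported on a single element.
    """
    h = 2.0 ** (-n)
    xi = i * h
    def g(x):
        x = np.asarray(x, dtype=float)
        return np.maximum(0.0, 1.0 - np.abs(x - xi) / h)
    return g
```

For example, with $n = 2$ (so $h = 1/4$) the hat centred at $x = 1/2$ is 1 there, 1/2 at $x = 3/8$, and 0 outside $[1/4, 3/4]$.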
Let us name the basic functions $\{g_i;\ i = 1, \dots, 2^n+1 \}$. Now take the function $f = x e^x$ and look for its best approximation of the form $\sum c_i g_i$ in $L_2$: that is, we want the coefficients $c_i$ for which $\Delta^2 := \int ^1 _0 \big(f(x) - \sum c_i g_i (x)\big)^2 dx$ is the smallest possible.
We know that the coefficient vector $c$ satisfies $A c = b$, where $A$ is the Gram matrix of the $g_i$, $a_{ij} := \int ^1 _0 g_i g_j \, dx$, and $b_i := \int ^1 _0 g_i f \, dx$.
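For this hat basis the Gram (mass) matrix is tridiagonal and can be written down in closed form. A sketch in NumPy, with an accurate per-element Gauss quadrature standing in for the "exact" $b_i$ (function names and the quadrature degree are my choices):

```python
import numpy as np

def assemble_A(n):
    """Exact Gram (mass) matrix of the hat basis on a uniform grid with
    2**n elements: a_ij = integral of g_i * g_j.  Tridiagonal, with 2h/3
    on the interior diagonal, h/3 at the two corner entries, h/6 off-diagonal."""
    m = 2 ** n + 1
    h = 2.0 ** (-n)
    A = np.zeros((m, m))
    np.fill_diagonal(A, 2 * h / 3)
    A[0, 0] = A[-1, -1] = h / 3
    idx = np.arange(m - 1)
    A[idx, idx + 1] = A[idx + 1, idx] = h / 6
    return A

def assemble_b_exact(n, f, deg=12):
    """b_i = integral of g_i * f, computed element-by-element with
    Gauss-Legendre quadrature; g_i is linear on each element, so this is
    exact up to the (tiny) quadrature error in the smooth factor f."""
    m = 2 ** n + 1
    h = 2.0 ** (-n)
    t, w = np.polynomial.legendre.leggauss(deg)  # nodes/weights on [-1, 1]
    b = np.zeros(m)
    for k in range(2 ** n):                      # element [k*h, (k+1)*h]
        x = (k + 0.5 + 0.5 * t) * h              # mapped quadrature nodes
        wx = 0.5 * h * w
        fx = f(x)
        b[k] += np.sum(wx * ((k + 1) - x / h) * fx)   # hat g_k on element
        b[k + 1] += np.sum(wx * (x / h - k) * fx)     # hat g_{k+1}
    return b
```

A handy sanity check: the hats sum to 1 on $[0,1]$, so $\sum_i b_i = \int_0^1 f\,dx$ (which is exactly 1 for $f = x e^x$) and $\sum_{ij} a_{ij} = \int_0^1 1\,dx = 1$.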
In my program the matrix $A$ is represented essentially exactly (up to machine-$\epsilon$ issues), but for $b$ I am required to use a Simpson-rule approximation of each $b_i$.
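One plausible reading of "Simpson-rule approximation of each $b_i$" is a single Simpson panel over the whole support of each hat; here is a sketch under that assumption (names are mine). Note that for an interior hat the panel midpoint lands exactly on the kink of $g_i$, where the quadratic model behind Simpson's rule is a poor fit: for slowly varying $f$ it gives $\tilde b_i \approx \tfrac{4}{3} h f(x_i)$ instead of the exact $\approx h f(x_i)$.

```python
import numpy as np

def simpson(fun, a, b):
    """One application of Simpson's rule on [a, b]."""
    mid = 0.5 * (a + b)
    return (b - a) / 6.0 * (fun(a) + 4.0 * fun(mid) + fun(b))

def assemble_b_simpson(n, f):
    """b_i approximated with ONE Simpson panel over the whole support of
    each hat (my assumed reading of the assignment).  For an interior hat
    the integrand g_i * f has a kink at the panel midpoint x_i."""
    m = 2 ** n + 1
    h = 2.0 ** (-n)
    b = np.zeros(m)
    for i in range(m):
        xi = i * h
        gi = lambda x: max(0.0, 1.0 - abs(x - xi) / h)  # hat centred at x_i
        lo, hi = max(0.0, xi - h), min(1.0, xi + h)
        b[i] = simpson(lambda x: gi(x) * f(x), lo, hi)
    return b
```

For an interior hat the endpoints of the panel are the zeros of $g_i$ and the midpoint is its peak, so the rule returns $\tfrac{2h}{6}\cdot 4 f(x_i) = \tfrac{4h}{3} f(x_i)$: a systematic 33% overestimate, independent of $h$.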
The problem is that $\Delta ^2 := \int ^1 _0 \big(f(x) - \sum c_i g_i (x)\big)^2 dx = (f,f) - 2(b, c) + (Ac, c)$ comes out negative! I believe this is a consequence of using Simpson's rule for $b$. My instructor doesn't think so, and he is not interested in a version of the program that uses exact integral values for $b$. So I need to show him that Simpson's rule cannot be used here.
My question: how can I prove that $\Delta ^2$ computed with Simpson's rule can be negative?
Note that since $c = A^{-1} b$,
$$\Delta ^2 = (f,f) - 2 (b,c) + (Ac, c) = (f, f) - (b, A^{-1} b).$$
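Here is a self-contained numerical sketch of this formula, under my assumption that the Simpson rule is applied as a single panel over the whole support of each hat (with a composite, per-element Simpson rule the effect would be far smaller). It evaluates $\Delta^2 = (f,f) - (b, A^{-1}b)$ once with an accurate $b$ and once with the Simpson $b$; under that assumption the Simpson value comes out clearly negative, while the accurate one stays nonnegative:

```python
import numpy as np

def demo(n=3):
    """Compute Delta^2 = (f,f) - (b, A^{-1} b) for f = x e^x with an
    accurate b and with a single-panel-Simpson b (an assumed scheme)."""
    f = lambda x: x * np.exp(x)
    m, h = 2 ** n + 1, 2.0 ** (-n)

    # Exact Gram (mass) matrix of the hat basis.
    A = np.diag(np.full(m, 2 * h / 3)) \
        + np.diag(np.full(m - 1, h / 6), 1) \
        + np.diag(np.full(m - 1, h / 6), -1)
    A[0, 0] = A[-1, -1] = h / 3

    # Accurate (f, f) and b via per-element Gauss-Legendre quadrature.
    t, w = np.polynomial.legendre.leggauss(12)
    b_exact, ff = np.zeros(m), 0.0
    for k in range(2 ** n):
        x = (k + 0.5 + 0.5 * t) * h
        wx = 0.5 * h * w
        fx = f(x)
        ff += np.sum(wx * fx * fx)                    # accumulates (f, f)
        b_exact[k] += np.sum(wx * ((k + 1) - x / h) * fx)
        b_exact[k + 1] += np.sum(wx * (x / h - k) * fx)

    # Simpson b: one panel over the support of each hat (assumed scheme);
    # the panel midpoint of an interior hat sits on the kink of g_i.
    b_simp = np.zeros(m)
    for i in range(m):
        xi = i * h
        g = lambda x: max(0.0, 1.0 - abs(x - xi) / h)
        lo, hi = max(0.0, xi - h), min(1.0, xi + h)
        mid = 0.5 * (lo + hi)
        b_simp[i] = (hi - lo) / 6 * (g(lo) * f(lo) + 4 * g(mid) * f(mid)
                                     + g(hi) * f(hi))

    d2 = lambda b: ff - b @ np.linalg.solve(A, b)     # (f,f) - (b, A^{-1} b)
    return d2(b_exact), d2(b_simp)
```

The mechanism: with exact $b$, $(b, A^{-1}b)$ is the squared norm of the orthogonal projection of $f$, so $\Delta^2 \ge 0$ automatically. The Simpson error inflates the interior $b_i$ by roughly a factor $4/3$, so $(b, A^{-1}b)$ grows by roughly $16/9$ and overshoots $(f,f)$, driving $\Delta^2$ negative.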