Why is $1,x,x^2,x^3,\cdots$ a bad basis?


For functions of $x$, the first basis I would think of contains the powers $1, x, x^2 , x^3 , ...$. Unfortunately this is a terrible basis. Those functions $x^n$ are just barely independent. $x^{10}$ is almost a combination of other basis vectors $1, x, ... , x^9$. It is virtually impossible to compute with this poor "ill-conditioned" basis.

This is said on page 426, Chapter 8 (Linear Transformations), of Introduction to Linear Algebra by Gilbert Strang, 5th edition.

Why do we say that $1,x,x^2,x^3,\cdots$ is a bad basis?

Taking Wronskians of pairs of these functions: $$ \begin{vmatrix}x&x^2\\1&2x\end{vmatrix}=2x^2-x^2=x^2\\ \begin{vmatrix}x^2&x^3\\2x&3x^2\end{vmatrix}=3x^4-2x^4=x^4\\ \begin{vmatrix}x^3&x^5\\3x^2&5x^4\end{vmatrix}=5x^7-3x^7=2x^7 $$

Linear independence looks fine. So what does it mean to say it is an ill-conditioned basis?

And how does one show that "those functions $x^n$ are just barely independent" ?
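One concrete way to make "ill-conditioned" quantitative is to look at the Gram matrix of the monomials under the $L^2$ inner product on $[-1,1]$, whose entries are $\int_{-1}^1 x^{i+j}\,dx$; its condition number explodes as the number of basis functions grows. The sketch below (my own illustration, not from Strang's book) computes this with NumPy:

```python
import numpy as np

# Gram matrix of the monomials 1, x, ..., x^{n-1} under the L^2 inner
# product on [-1, 1]: G[i, j] = integral of x^(i+j) over [-1, 1],
# which is 2/(i+j+1) when i+j is even and 0 when i+j is odd.
def monomial_gram(n):
    G = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if (i + j) % 2 == 0:
                G[i, j] = 2.0 / (i + j + 1)
    return G

# The condition number grows rapidly with n, which is exactly what
# makes numerical computation in this basis unreliable.
for n in (4, 8, 12):
    print(n, np.linalg.cond(monomial_gram(n)))
```

A large condition number here means the Gram matrix is nearly singular, i.e. some linear combination of the monomials is nearly the zero function, which is another way of saying the basis vectors are "just barely independent."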

There are 2 answers below.

Best answer:

Just to add to angryavian's answer: if you are working in the space of functions defined over, say, $[-1, 1]$, then what Strang likely refers to is the fact that the projection $p(x)$ of $x^{10}$ onto the space spanned by $\{1, x, x^2, \ldots, x^9\}$ is quite close to $x^{10}$ itself in a least-squares sense, namely that $$\|p(x) - x^{10}\|_{L^2}^2 = \int_{-1}^1 \left|p(x) - x^{10}\right|^2 \, dx$$ is small. We don't even need to calculate $p(x)$ exactly to verify this, because we can observe that even $x^8$ is a decent approximation to $x^{10}$ on the interval $[-1,1]$: $$\|x^8 - x^{10}\|_{L^2}^2 \approx 0.0024,$$ and since $p(x)$ is the best approximation in the subspace, the squared distance between $p(x)$ and $x^{10}$ is bounded above by this value. Indeed, the distance between $x^k$ and its least-squares approximation in the subspace spanned by $\{1, x, x^2, \ldots, x^{k-1}\}$ tends to zero as $k \to \infty$. Intuitively, this means that the higher $k$ goes, the closer adding $x^k$ comes to adding a linearly dependent vector to your set.
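These numbers are easy to check. A small sketch (my own, using NumPy's Legendre utilities): the distance $\|x^8 - x^{10}\|_{L^2}^2$ has the closed form $2(1/17 - 2/19 + 1/21)$, and because the Legendre polynomials are orthogonal on $[-1,1]$, the projection error of $x^k$ onto the lower-degree monomials is exactly the leading Legendre component of $x^k$:

```python
import numpy as np
from numpy.polynomial import legendre as L

# Squared L^2 distance on [-1, 1] between x^8 and x^10, in closed form:
# integral of (x^8 - x^10)^2 = x^16 - 2 x^18 + x^20 over [-1, 1].
d2 = 2 * (1/17 - 2/19 + 1/21)
print(d2)  # ≈ 0.00236

# Projection error of x^k onto span{1, x, ..., x^{k-1}}: expand x^k in
# the orthogonal Legendre basis; the error is the leading Legendre
# component c_k * P_k, whose squared norm is c_k^2 * 2/(2k+1).
def proj_error(k):
    c = L.poly2leg([0] * k + [1])          # Legendre coefficients of x^k
    return np.sqrt(c[k] ** 2 * 2 / (2 * k + 1))

# The error shrinks rapidly as k grows: x^k gets ever closer to the
# span of the lower monomials.
for k in (2, 5, 10, 15):
    print(k, proj_error(k))
```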

A remark: even though I assumed for the sake of example that the polynomials were defined over the range $[-1, 1]$, this extends to any finite interval. Furthermore, the concept is not restricted to the canonical inner product $\langle f, g \rangle = \int f \bar g \, dx$; I suspect that under many inner products Strang's statement holds in the way I've described.

Second answer:

The vectors $(1,0,0), (1,0.01, 0), (1,0.01, 0.01)$ are linearly independent in $\mathbb{R}^3$, but are very close to each other.
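You can see this near-dependence numerically: stacking these three vectors into a matrix gives a nonzero but tiny determinant (so they are independent) alongside a large condition number (so they are nearly dependent). A quick sketch:

```python
import numpy as np

# The three vectors from the example, as rows of a matrix. The matrix
# is lower triangular, so its determinant is the product of the
# diagonal entries: 1 * 0.01 * 0.01 = 1e-4, which is nonzero but tiny.
A = np.array([[1.0, 0.00, 0.00],
              [1.0, 0.01, 0.00],
              [1.0, 0.01, 0.01]])
print(np.linalg.det(A))   # 1e-4: independent, but barely
print(np.linalg.cond(A))  # large: the rows are nearly parallel
```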

Strang may be saying that the monomials $1,x,x^2,\ldots$ are far from orthogonal. Other bases behave much better: the Legendre polynomials are orthogonal with respect to the $L^2$ inner product on $[-1,1]$, and the Chebyshev polynomials are orthogonal with respect to the weighted inner product $\langle f, g\rangle = \int_{-1}^1 f(x)\,g(x)\,(1-x^2)^{-1/2}\,dx$.
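To see the contrast with the monomial basis, one can compute the Gram matrix of the first few Legendre polynomials under the plain $L^2$ inner product on $[-1,1]$: it comes out diagonal, with a small condition number. A sketch of this check (my own, using Gauss-Legendre quadrature, which is exact for these polynomial products):

```python
import numpy as np
from numpy.polynomial import legendre as L

n = 6
# Gauss-Legendre quadrature with 2n nodes integrates polynomials of
# degree up to 4n - 1 exactly; the products P_i * P_j have degree
# at most 2n - 2, so the Gram matrix below is exact (up to rounding).
x, w = L.leggauss(2 * n)
# Evaluate P_0, ..., P_{n-1} at the quadrature nodes.
P = np.array([L.legval(x, [0] * k + [1]) for k in range(n)])
# G[i, j] = sum_m w_m * P_i(x_m) * P_j(x_m) = integral of P_i * P_j.
G = (P * w) @ P.T
print(np.round(G, 10))       # diagonal, with G[k, k] = 2/(2k+1)
print(np.linalg.cond(G))     # small, unlike the monomial Gram matrix
```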