Finding orthonormal basis of subspace spanned by two functions


From S. Lang's Linear Algebra:

Let $V$ be the subspace of functions generated by the two functions $f$, $g$ such that $f(t)=t$ and $g(t)=t^2$. Find an orthonormal basis for $V$.

In this case, $V$ is considered as space of continuous real-valued functions on the interval $[0, 1]$. Scalar product of two functions $f$ and $g$ is defined as:

$$⟨f, g⟩=\int_{0}^{1} f(t)g(t) \, \, dt$$

where $t \in [0, 1]$.

I have shown that the integral above is a valid scalar product by viewing it as a Riemann sum approximation and verifying the required properties (but this is not important here).


Problem:

I know that I can use the Gram–Schmidt orthogonalization process to make the two elements $f(t)=t$ and $g(t)=t^2$ mutually orthogonal, by subtracting from $g$ its projection onto $f$:

$$t^2 - \frac{\int_{0}^{1} t^3 \, dt}{\int_{0}^{1} t^2 \, dt}\,t = t^2 - \frac{\frac{1^4}{4} - \frac{0^4}{4}}{\frac{1^3}{3}-\frac{0^3}{3}}\,t = t^2-\frac{3}{4}t$$

And then divide each by its norm to ensure orthonormality:

$$s=\left(\frac{t}{\sqrt{\int_{0}^{1} t^2 \, dt}},\ \frac{t^2-\frac{3}{4}t}{\sqrt{\int_{0}^{1} \left(t^2-\frac{3}{4}t\right)^2 \, dt}}\right)$$

(I apologize if I made any mistakes in calculations).
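As a sanity check on the computation (a sketch, not part of the exercise itself, assuming `sympy` is available), one can carry out the Gram–Schmidt step and the normalization symbolically and verify that the resulting pair is orthonormal under this scalar product:

```python
import sympy as sp

t = sp.symbols('t')

def inner(f, g):
    # scalar product <f, g> = integral of f(t) g(t) over [0, 1]
    return sp.integrate(f * g, (t, 0, 1))

f = t
g = t**2

# Gram-Schmidt: subtract from g its projection onto f
h = g - (inner(g, f) / inner(f, f)) * f   # t^2 - (3/4) t

# divide each vector by its norm
e1 = f / sp.sqrt(inner(f, f))
e2 = h / sp.sqrt(inner(h, h))

print(sp.simplify(e1), sp.simplify(e2))
print(sp.simplify(inner(e1, e2)))  # expect 0 (orthogonal)
print(sp.simplify(inner(e1, e1)))  # expect 1 (unit norm)
print(sp.simplify(inner(e2, e2)))  # expect 1 (unit norm)
```

The simplified basis vectors come out to $\sqrt{3}\,t$ and $\sqrt{5}\,(4t^2-3t)$.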

But now the problem is to decide whether the elements of $s$ are linearly independent. Considering the homogeneous equation

$$c_1\frac{t}{\sqrt{\int_{0}^{1} t^2 \, dt}}+c_2\frac{t^2-\frac{3}{4}t}{\sqrt{\int_{0}^{1} \left(t^2-\frac{3}{4}t\right)^2 \, dt}}=0,$$

if $s$ is a valid orthonormal basis for $V$, then we must have $c_1=c_2=0$. But I don't know how to show that the two vectors are linearly independent.

How can I finish the proof by showing linear independence?

Accepted answer:

To expand a little on my comment: I'll do this in a toy setting, and you can easily extend it further. Let $(V,\langle\cdot,\cdot\rangle)$ be a real inner product space with $\dim(V)\geq 2$.

Take $v,w\in V\setminus\{\mathbf 0\}$ (otherwise they are clearly not linearly independent) and suppose that they are orthogonal, i.e. that $\langle v,w\rangle=0$. Now suppose $\alpha v+\beta w=\mathbf 0$. We want to show that $\alpha=\beta=0$.

To see this, first consider that $\langle \alpha v+\beta w,v\rangle=0$ as $\alpha v+\beta w$ is the null vector. By linearity in the first argument, symmetry and orthogonality, we have

$$0=\alpha\langle v,v\rangle+\beta\langle w,v\rangle=\alpha\langle v,v\rangle+\beta\langle v,w\rangle=\alpha\langle v,v\rangle$$

As $v$ is not the null vector, $\langle v,v\rangle>0$ by positive definiteness and thus $\alpha=0$.

The same argument also leads us to $\beta=0$ by considering $\langle\alpha v+\beta w,w\rangle$.

Thus, the vectors $v,w$ are linearly independent.
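The argument above can be watched in action symbolically (a sketch assuming `sympy`, using the two orthogonal vectors from the question before normalization): pairing $\alpha v+\beta w$ with $v$ kills the $w$-term by orthogonality, isolating $\alpha$, and pairing with $w$ isolates $\beta$.

```python
import sympy as sp

t, a, b = sp.symbols('t alpha beta')

def inner(f, g):
    # scalar product <f, g> = integral of f(t) g(t) over [0, 1]
    return sp.integrate(f * g, (t, 0, 1))

# two orthogonal, nonzero vectors in C[0, 1]
v = t
w = t**2 - sp.Rational(3, 4) * t

combo = a * v + b * w  # a generic linear combination

# <combo, v> = a <v, v> + b <w, v> = a <v, v>, since <w, v> = 0
print(sp.expand(inner(combo, v)))   # a multiple of alpha only
# <combo, w> = b <w, w>, by the same orthogonality argument
print(sp.expand(inner(combo, w)))   # a multiple of beta only
```

So if `combo` is the zero function, both inner products vanish, and since $\langle v,v\rangle>0$ and $\langle w,w\rangle>0$, the coefficients $\alpha$ and $\beta$ must both be zero.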