Identifying common features of all elements in an orthogonal complement in a vector space of polynomials


Let $V$ be the vector space of polynomials in the variable $t$ of degree at most $2$ over $\mathbb{R}$. An inner product on $V$ is defined by $$\langle f,g \rangle = \int_0^1 f(t)g(t)\,dt .$$ Let $$W=\operatorname{span}\{1+t^2,1-t^2\}$$ and $W^{\perp}$ be the orthogonal complement of $W$. Which of the following conditions is satisfied for all $h\in W^{\perp}$?

(a) $h$ is an even function.

(b) $h$ is an odd function.

(c) $h(t)=0$ has a real solution.

(d) $h(0)=0$.

We can write any element of $W$ as $$w=c_1(1+t^2)+c_2(1-t^2)$$ where $c_1,c_2\in \mathbb{R}$. Now if we take $h(t)\in W^\perp$, we have $$\langle w,h \rangle =\int_0^1 [c_1(1+t^2)+c_2(1-t^2)]h(t)\,dt=0$$ for every choice of $c_1,c_2$, which (taking $(c_1,c_2)=(1,0)$ and $(0,1)$) implies $$\int_0^1 (1+t^2)h(t)\,dt=0$$ and $$\int_0^1 (1-t^2)h(t)\,dt=0 .$$ Now one can check that if $h(t)$ is a nonzero even or odd function, these equalities cannot both hold, so $(a),(b)$ are discarded. However, I am unable to check whether $h(t)=0$ has a real solution, or whether $h$ vanishes at $0$, using the above equalities. Is the solution on the right track? Any help is appreciated.
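Where the two equalities lead can be checked mechanically with exact rational arithmetic, since $\int_0^1 t^k\,dt = \frac{1}{k+1}$. Here is a minimal Python sketch; the coefficient-list encoding `[a0, a1, a2]` for $a_0 + a_1 t + a_2 t^2$ is my own convention, not part of the question:

```python
from fractions import Fraction as F

# Exact moments of the inner product <f,g> = ∫₀¹ f(t)g(t) dt on monomials:
# <t^i, t^j> = ∫₀¹ t^(i+j) dt = 1/(i+j+1).
def ip(p, q):
    """Inner product of polynomials given as coefficient lists [a0, a1, a2]."""
    return sum(F(pi * qj, i + j + 1)
               for i, pi in enumerate(p)
               for j, qj in enumerate(q))

# h = a + bt + ct² must satisfy <1+t², h> = 0 and <1-t², h> = 0, i.e.
# (adding and subtracting) <1, h> = 0 and <t², h> = 0.  With the moments:
#   a + b/2 + c/3 = 0
#   a/3 + b/4 + c/5 = 0
# Fix c = 15 and solve by elimination: substituting a = -b/2 - c/3 into the
# second equation gives b/12 + 4c/45 = 0, so b = -16c/15.
c = F(15)
b = -16 * c / 15
a = -b / 2 - c / 3
h = [a, b, c]                  # h(t) = 3 - 16t + 15t²
print(h)
print(ip([1, 0, 1], h))        # <1+t², h> = 0
print(ip([1, 0, -1], h))       # <1-t², h> = 0
```

So, up to scaling, $W^{\perp}$ is spanned by a single polynomial, and both orthogonality equalities check out exactly.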

2 Answers

I think the answer might be $(c)$. Let me show you my approach; it's not the most efficient way, I think, but it should work.

As we know from theory, $\mathbb{R}_{2}[x]$ is isomorphic to $\mathbb{R}^{3}$, since both have dimension $3$.

We also have $\dim \operatorname{span}(1+t^{2},\,1-t^{2}) = 2$.

(Formally, you can take the coordinate isomorphism and check that the coordinate columns are independent, which they are: with respect to the canonical basis $(e_{1},e_{2},e_{3})$ the coordinates $\begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix},\begin{pmatrix}1 \\ 0 \\ -1\end{pmatrix}$ contain the minor $\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, which has rank $2$.)

Knowing that, we can "move" the problem from $\mathbb{R}_{2}[x]$ to $\mathbb{R}^{3}$: we fix a basis, write down the matrix of the scalar product in that basis, and then we just have to compute $x^{T}M\,y$ on coordinate vectors. Since the coordinates above are taken with respect to the canonical basis, let us use $B=(1,t,t^{2})$ and call the matrix $M$.

Note also that we can decompose $\mathbb{R}_{2}[x] = W \oplus \operatorname{span}(t)$, which corresponds to $\mathbb{R}^{3} = \operatorname{span}\left(\begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix},\begin{pmatrix}1 \\ 0 \\ -1\end{pmatrix}\right) \oplus \operatorname{span}(e_{2})$.

Integrating products of the basis polynomials, $\langle t^{i},t^{j} \rangle = \int_{0}^{1} t^{i+j}\,dt = \frac{1}{i+j+1}$, we find all the possible $\langle f,g \rangle$, which leads to the associated matrix $$M = \begin{pmatrix} 1 & \frac{1}{2} & \frac{1}{3} \\ \frac{1}{2} & \frac{1}{3} & \frac{1}{4} \\ \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \end{pmatrix}$$

We are almost done. We know that $W = \operatorname{span}\left\{\begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix},\begin{pmatrix}1 \\ 0 \\ -1\end{pmatrix}\right\}$, so a generic vector $v = \begin{pmatrix} a \\ b \\ c \end{pmatrix}$ (the coordinates of $a+bt+ct^{2}$) lies in $W^{\perp}$ exactly when $\langle v,w \rangle = 0$ for all $w \in W$. Since it suffices to check this condition on a basis of $W$, we obtain:

$$\left\langle v,\begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix}\right\rangle = (a,b,c) \cdot M \cdot \begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix} = 0$$

$$\left\langle v,\begin{pmatrix}1 \\ 0 \\ -1\end{pmatrix}\right\rangle = (a,b,c) \cdot M \cdot \begin{pmatrix}1 \\ 0 \\ -1\end{pmatrix} = 0$$

This leads to a linear system, in this case $$\begin{cases}\frac{4a}{3} + \frac{3b}{4}+\frac{8c}{15} = 0 \\ \frac{2a}{3} + \frac{b}{4} + \frac{2c}{15} = 0\end{cases}$$

Its solutions form a line, $(a,b,c) = s\,(3,-16,15)$ with $s \in \mathbb{R}$, so every element of $W^{\perp}$ is a scalar multiple of $h(t) = 3 - 16t + 15t^{2}$. From here you can pretty easily decide among the answers: $h$ is neither even nor odd, $h(0) = 3 \neq 0$, and the discriminant $(-16)^{2} - 4 \cdot 15 \cdot 3 = 76 > 0$ shows that $h(t)=0$ has real solutions, so the answer is $(c)$.
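As a cross-check of this approach, here is a sketch in exact rational arithmetic. It assumes coordinates in the canonical basis $(1,t,t^{2})$, so the Gram entries are $\langle t^{i},t^{j}\rangle = \frac{1}{i+j+1}$:

```python
from fractions import Fraction as F

# Gram matrix of <f,g> = ∫₀¹ f g dt in the basis (1, t, t²):
# entry (i, j) is <t^i, t^j> = 1/(i+j+1).
M = [[F(1, i + j + 1) for j in range(3)] for i in range(3)]

def mat_vec(M, x):
    """Matrix-vector product over the rationals."""
    return [sum(row[k] * x[k] for k in range(3)) for row in M]

w1 = [1, 0, 1]    # coordinates of 1 + t²
w2 = [1, 0, -1]   # coordinates of 1 - t²

# v = (a, b, c) lies in W⊥ iff v·(M w1) = 0 and v·(M w2) = 0; the vectors
# M w1 and M w2 are the coefficient rows of that linear system.
print([str(x) for x in mat_vec(M, w1)])   # ['4/3', '3/4', '8/15']
print([str(x) for x in mat_vec(M, w2)])   # ['2/3', '1/4', '2/15']

# The solution line is spanned by (3, -16, 15), i.e. h(t) = 3 - 16t + 15t²:
v = [3, -16, 15]
for w in (w1, w2):
    print(sum(vk * mk for vk, mk in zip(v, mat_vec(M, w))))   # 0, then 0
```

Changing the basis only changes $M$ by congruence, so the resulting subspace $W^{\perp}$ is the same whichever basis you pick.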

The main inconvenience, as you can see, is determining the matrix associated to the scalar product; besides, it wasn't particularly pleasant for calculations.

One thing we might do in general to facilitate this process is to determine the definiteness of the matrix and find a basis in which it takes its normal form.

I'm sorry if this is not exactly the answer you were looking for, but I hope it gives you a more general picture so that you can reuse these notions in the future.


Hint Suppose $h \in W^{\perp}$. Then, since $1 = \frac{1}{2}\left[(1+t^2)+(1-t^2)\right] \in W$, we have $$0 = \langle 1, h \rangle = \int_0^1 h(t) \,dt .$$

Now, apply the Mean Value Theorem for integrals: a continuous function with zero average on $[0,1]$ must vanish somewhere in $(0,1)$.
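To see the hint in action, here is a small sketch. It takes $h(t)=15t^{2}-16t+3$ as an assumption (a generator of $W^{\perp}$, computed elsewhere on this page) and confirms that the zero integral forces a sign change, hence a real root, in $(0,1)$:

```python
from fractions import Fraction as F

# Assumption: h(t) = 3 - 16t + 15t² generates W⊥ (every element of W⊥ is a
# scalar multiple of it).  Coefficients in increasing degree:
coeffs = [F(3), F(-16), F(15)]

def h(t):
    return sum(c * t**k for k, c in enumerate(coeffs))

# ∫₀¹ h(t) dt = Σ cₖ/(k+1); it vanishes, exactly as the hint requires.
integral = sum(c / (k + 1) for k, c in enumerate(coeffs))
print(integral)                       # 0

# By the mean value theorem for integrals, h must then vanish in (0, 1);
# indeed it changes sign there:
print(h(F(0)), h(F(1, 2)), h(F(1)))   # 3 -5/4 2
```

Since the multiple of $h$ is irrelevant to where $h$ vanishes, this argument settles option (c) for every nonzero element of $W^{\perp}$.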

Here are less computationally involved ways to approach (a), (b):

(a) The space $W = \operatorname{span}\{1, t^2\}$ already consists of all of the even polynomials in $V$, so (a) would imply $W^\perp \subseteq W$, forcing $W^\perp \subseteq W \cap W^\perp = \{0\}$; but $\dim W^\perp = 1$.

(b) The set of odd polynomials in $V$ is spanned by $t$, but $\langle 1 , t \rangle = \int_0^1 t \,dt = \frac{1}{2} \neq 0$, so $t \notin W^{\perp}$.