Find the distance from the vector $h(t)=t$ to the subspace $$M=\left\{x(t)\in L^2(0,\pi):\int_0^{\pi}x(t)\cos t \, dt=0\right\}.$$
So, it's a basic exercise, but I'm a bit confused about how to solve it. Can someone explain what to do? I know that by definition $$\operatorname{dist}(h,M)=\inf_{x\in M}\|t-x(t)\|,$$ but this is a constrained extremum problem. Can I solve it using Fourier coefficients, since any function in a Hilbert space can be approximated by a linear combination of basis elements?
I hope someone can explain how to find the distance that way, or perhaps there are faster methods.
Suppose you represent $y(t)$ as $x(t)+f(t)$, where $x(t)\in M$. Then the squared distance from $y(t)$ to $x(t)$ is $\langle f,f\rangle$ (note: I'm ignoring the square root because it's irrelevant for optimization purposes). Setting the derivative to zero gives $2\langle f,df\rangle = 0$. If $y(t)$ is fixed and $x(t)$ can vary, then $x(t)$ varies only within $M$, and since $M$ is a subspace, $dx\in M$. Since $f = y-x$ and $y$ is fixed, $df = -dx$, and thus $df\in M$ as well. So at the minimum, $f$ is a vector orthogonal to every element of $M$. Intuitively, the distance from a point to a surface is the length of the normal vector from the surface to the point (a normal vector from a surface to a point is not in general unique, but for a hyperplane it is).
You are already given that $\langle \cos t, x\rangle = 0$ for every element $x$ of $M$. With a little hand waving, it follows that $f = a\cos t$ for some scalar $a$, and so the distance from $y$ to $M$ is $|a|\,\|\cos t\|$. Projecting onto $\cos t$ gives $a = \langle y,\cos t\rangle/\langle \cos t,\cos t\rangle$, so the distance is $|\langle y,\cos t\rangle|/\|\cos t\|$.
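As a quick numerical sanity check of this formula for the specific $h(t)=t$ (a sketch using SciPy quadrature, not part of the original answer): integration by parts gives $\langle h,\cos t\rangle = \int_0^\pi t\cos t\,dt = -2$ and $\|\cos t\|^2 = \pi/2$, so the distance should come out to $2/\sqrt{\pi/2} = 2\sqrt{2/\pi}$.

```python
import numpy as np
from scipy.integrate import quad

# M is the hyperplane orthogonal to cos(t) in L^2(0, pi),
# so dist(h, M) = |<h, cos>| / ||cos||.
inner, _ = quad(lambda t: t * np.cos(t), 0, np.pi)     # <h, cos> = -2
norm_sq, _ = quad(lambda t: np.cos(t) ** 2, 0, np.pi)  # ||cos||^2 = pi/2

dist = abs(inner) / np.sqrt(norm_sq)                   # 2 * sqrt(2/pi)
print(dist)
```

This agrees with the closed form $2\sqrt{2/\pi}\approx 1.596$.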