So like the title says: suppose we have a continuous function $f:[0,1] \rightarrow \mathbb R$ such that $\int_0^1\sin(nx)f(x)\,dx=0$ for every positive integer $n$. Why does this imply that $f=0$? I was thinking of appealing to Fourier series, but here's the problem: the integrals above have the wrong scaling for the sine coefficients, which are usually taken over an interval of length $2\pi$ rather than $[0,1]$. If we can rescale appropriately, the rest of the argument goes through: since $f$ is continuous on $[0,1]$, it is in $L^2$, and the Fourier series of an $L^2$ function converges to that function in the $L^2$ sense. Since all the Fourier coefficients are zero, $f$ has zero $L^2$ norm, and by continuity $f=0$.
So my question is really: how can we rescale the sine in the integral above so that it matches the usual formula for the Fourier coefficients?
Use Fourier series on the interval $[0,2\pi]$, extending $f$ to a function $g$ that is $0$ on $(1,2\pi]$. Every sine coefficient of $g$ is $\frac{1}{\pi}\int_0^{2\pi} g(x)\sin(nx)\,dx=\frac{1}{\pi}\int_0^1 f(x)\sin(nx)\,dx=0$, so $g$ agrees in $L^2$ with its cosine series. Since $\cos(n(2\pi-x))=\cos(nx)$, that series is symmetric about $x=\pi$, hence $g(x)=g(2\pi-x)$ for a.e. $x$. For $x\in[0,1]$ we have $2\pi-x\in[2\pi-1,2\pi]\subset(1,2\pi]$, where $g$ vanishes, so $f=0$ a.e. on $[0,1]$ and, by continuity, everywhere.
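As a numerical sanity check of the rescaling step (a hypothetical example, taking $f(x)=x(1-x)$ and plain trapezoidal sums), one can verify that the sine Fourier coefficients of the zero-extension on $[0,2\pi]$ are exactly the original integrals $\int_0^1 f(x)\sin(nx)\,dx$ divided by $\pi$:

```python
import numpy as np

# Hypothetical illustration: f(x) = x*(1-x) on [0,1], extended by zero to [0, 2*pi].
# Check that the n-th sine Fourier coefficient of the extension on [0, 2*pi]
# equals (1/pi) * integral_0^1 f(x) sin(nx) dx, i.e. the integral from the
# question up to the constant factor 1/pi.

def trapezoid(y, x):
    """Plain trapezoidal rule (kept explicit to avoid NumPy-version issues)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

f = lambda x: x * (1 - x)

x1 = np.linspace(0.0, 1.0, 200_001)        # fine grid on [0, 1]
x2 = np.linspace(0.0, 2 * np.pi, 200_001)  # fine grid on [0, 2*pi]
g = np.where(x2 <= 1.0, f(x2), 0.0)        # zero extension of f

errs = []
for n in range(1, 6):
    b_n = trapezoid(g * np.sin(n * x2), x2) / np.pi         # sine coefficient of g
    target = trapezoid(f(x1) * np.sin(n * x1), x1) / np.pi  # rescaled original integral
    errs.append(abs(b_n - target))

print("max discrepancy:", max(errs))
```

The discrepancy is at the level of quadrature error, consistent with the two quantities being identical by definition of the extension.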