Prove that:
If
$$\DeclareMathOperator{\Dm}{\,\Bbb d\!} \int\limits_{\Omega} [u_t + (f(u))_x ] \phi \Dm t \Dm x =0$$ for all $\phi \in C_0^\infty(\Omega)$, then it holds even for all $\phi \in C_0(\Omega)$. Here $u=u(t,x)\in C^1(\Omega)$ is a solution of the equation $u_t +(f(u))_x =0$, $\Omega$ is an open subset of $\Bbb R^2$ and $f \in C^1(\Bbb R)$.
Would anybody help me prove the latter assertion rigorously?
My idea is to use the fact that $C_0^\infty(\Omega)$ is dense in $C_0(\Omega)$ (with respect to the sup norm): for each $\epsilon>0$ there is $\phi_\epsilon \in C_0^\infty(\Omega)$ with $|\phi -\phi_{\epsilon}|<\epsilon$, and in the integral I can replace $\phi$ by $(\phi -\phi_{\epsilon}) + \phi_{\epsilon}$.
But I’m not really sure what I can do after that.
You have $$\int\limits_{\Omega} [u_t + (f(u))_x ] \phi \, dt \, d x =0$$ for all $\phi \in C^\infty_0(\Omega)$. Now let $\phi \in C_0(\Omega)$, and fix an open set $\Omega'$ with compact closure $\overline{\Omega'} \subset \Omega$ containing $\operatorname{supp}\phi$. For each $\epsilon >0$ there is $\phi_\epsilon \in C^\infty_0(\Omega')$ with $|\phi (x) - \phi_\epsilon(x)|<\epsilon$ for all $x\in \Omega$; note that the same $\Omega'$ works for every $\epsilon$. Since $\int_\Omega [u_t + (f(u))_x ] \phi_\epsilon \, dt \, dx = 0$ by hypothesis, \begin{align} \left| \int _\Omega [u_t + (f(u))_x ] \phi \, dt \, d x\right| &= \left| \int_{\Omega'} [u_t + (f(u))_x ] (\phi -\phi_\epsilon) \, dt \, d x\right| \\ &\le \epsilon \int_{\Omega'} |u_t + (f(u))_x |\, dt \, d x = M\epsilon, \end{align} where $$M = \int_{\Omega'} |u_t + (f(u))_x |\, dt\, dx$$ is finite because the integrand is continuous on the compact set $\overline{\Omega'}$. Since $\epsilon>0$ is arbitrary,
$$\int_\Omega [u_t + (f(u))_x ] \phi \, dt \, d x = 0. $$
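For completeness, here is one standard way to produce such a $\phi_\epsilon$ with support in a fixed $\Omega' \supset \operatorname{supp}\phi$, by mollification; this is only a sketch of the usual construction, not part of the answer above.

```latex
Let $\rho \in C_0^\infty(\Bbb R^2)$ satisfy $\rho \ge 0$,
$\operatorname{supp}\rho \subset B(0,1)$ and $\int \rho = 1$, and set
$\rho_\delta(z) = \delta^{-2}\rho(z/\delta)$. Extend $\phi$ by $0$ outside
$\Omega$ and define
$$\phi_\epsilon(z) = (\phi * \rho_\delta)(z)
   = \int_{\Bbb R^2} \phi(z-y)\,\rho_\delta(y)\, dy .$$
Then $\phi_\epsilon \in C^\infty(\Bbb R^2)$ and
$$\operatorname{supp}\phi_\epsilon
   \subset \{\, z : \operatorname{dist}(z,\operatorname{supp}\phi) \le \delta \,\},$$
so $\operatorname{supp}\phi_\epsilon \subset \Omega'$ once $\delta$ is small.
Moreover, since $\int \rho_\delta = 1$,
$$|\phi(z) - \phi_\epsilon(z)|
   \le \int \rho_\delta(y)\,|\phi(z) - \phi(z-y)|\, dy
   \le \sup_{|y|\le \delta}\,\sup_{z}\,|\phi(z) - \phi(z-y)| ,$$
which is $<\epsilon$ for $\delta$ small enough, by the uniform continuity of
$\phi$ (continuous with compact support).
```

Choosing $\delta = \delta(\epsilon)$ accordingly gives the family $\phi_\epsilon \in C_0^\infty(\Omega')$ used in the estimate.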