Let $\Omega$ be a bounded open subset of $\mathbb R^d$ and let $g\in L^2(\Omega)$. Let $(\lambda_n)_{n\in \mathbb{N}}$ be the (Dirichlet) eigenvalues of $-\Delta$, and $(e_n)_n$ an orthonormal basis of associated eigenfunctions. There exists a unique $u=(u_t)\in C^0([0,+\infty [,L^2(\Omega ))$ that solves the problem $$\dfrac{\partial u}{\partial t}-\Delta u=0 \ \text{ in } \mathcal D'(]0,+\infty [\times \Omega),\quad u_0=g,\quad u_t\in H^1_0(\Omega ),$$ and it is given by $$u_t=\sum^{+\infty }_{n=1}e^{-\lambda_n t}(g,e_n)\, e_n.$$
My question is: how do we prove that $$\dfrac{\partial u}{\partial t}-\Delta u=0 \ \text{ in } \mathcal D'(]0,+\infty [\times \Omega)\,?$$
That is, for all $\psi \in C^{\infty}_c(]0,+\infty[ \times \Omega)$, $$\left\langle \dfrac{\partial u}{\partial t}-\Delta u ,\psi \right\rangle_{\mathcal D',\mathcal D} = 0.$$
Thanks for the help.
Let $u_n=e^{-\lambda_n t}(g,e_n)e_n$, so that $u=\sum_n u_n$. Clearly $(\partial/\partial t-\Delta)u_n=0$: indeed $\partial_t u_n=-\lambda_n u_n$ and $\Delta u_n=-\lambda_n u_n$, since $-\Delta e_n=\lambda_n e_n$. Next notice that $\sum_n u_n$ does converge in $\mathcal D'(]0,\infty[\times\Omega)=:\mathcal D'$, as it converges in $L^2(]0,\infty[\times\Omega)$. Let $T=\partial/\partial t-\Delta$. The beauty of distributions is that $T$ (as any linear differential operator with smooth coefficients) is a continuous map $\mathcal D'\to\mathcal D'$, and thus $T(\sum_n u_n)=\sum_n (T u_n)=0$.
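For completeness, here is the $L^2$ convergence spelled out (using that the Dirichlet eigenvalues of $-\Delta$ on a bounded domain satisfy $0<\lambda_1\le\lambda_n$): by orthonormality of $(e_n)$ in $L^2(\Omega)$ and Fubini,
$$\Big\|\sum_{n=p}^{q} u_n\Big\|_{L^2(]0,\infty[\times\Omega)}^2=\sum_{n=p}^{q}|(g,e_n)|^2\int_0^\infty e^{-2\lambda_n t}\,dt=\sum_{n=p}^{q}\frac{|(g,e_n)|^2}{2\lambda_n}\le\frac{1}{2\lambda_1}\sum_{n=p}^{q}|(g,e_n)|^2,$$
which tends to $0$ as $p,q\to\infty$ because $\sum_n|(g,e_n)|^2=\|g\|_{L^2(\Omega)}^2<\infty$ by Parseval. So the partial sums are Cauchy in $L^2(]0,\infty[\times\Omega)$, hence converge there, and a fortiori in $\mathcal D'$.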
PS: Continuity is seen as follows ($S$ denotes the adjoint of $T$, here $S=-\partial/\partial t-\Delta$): if $f_n\to f$ in $\mathcal D'$ then $$\langle Tf_n,\phi\rangle=\langle f_n,S\phi\rangle\to\langle f,S\phi\rangle=\langle Tf,\phi\rangle.$$
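If it helps intuition, the identity $\langle u,S\phi\rangle=\langle Tu,\phi\rangle=0$ can be sanity-checked numerically. Here is a sketch in the 1D case $\Omega=]0,\pi[$, where the Dirichlet eigenpairs of $-\Delta$ are $\lambda_n=n^2$, $e_n(x)=\sqrt{2/\pi}\sin(nx)$; the initial datum $g(x)=x(\pi-x)$ and the bump-like test function $\phi$ are my own arbitrary choices, not from the question:

```python
import numpy as np

# Sanity check on Omega = ]0, pi[: Dirichlet eigenpairs of -Laplacian are
# lambda_n = n^2, e_n(x) = sqrt(2/pi) sin(n x).  We check numerically that
# <u, S phi> ~ 0, where S = -d/dt - Laplacian is the transpose of
# T = d/dt - Laplacian, for a test function phi vanishing on the boundary.

# grid on [0.05, 1] x [0, pi] (staying away from t = 0)
t = np.linspace(0.05, 1.0, 400)
x = np.linspace(0.0, np.pi, 400)
T_grid, X = np.meshgrid(t, x, indexing="ij")
dt, dx = t[1] - t[0], x[1] - x[0]

# initial datum g(x) = x(pi - x) (an arbitrary example);
# its sine coefficients: (g, e_n) = sqrt(2/pi) * 2 (1 - (-1)^n) / n^3
N = 50
n = np.arange(1, N + 1)
c = np.sqrt(2 / np.pi) * 2 * (1 - (-1.0) ** n) / n**3

# u(t, x) = sum_n e^{-n^2 t} (g, e_n) e_n(x)
u = np.zeros_like(T_grid)
for k, ck in zip(n, c):
    u += np.exp(-k**2 * T_grid) * ck * np.sqrt(2 / np.pi) * np.sin(k * X)

# test function phi: vanishes (with its first derivatives) on the boundary
# of the space-time cylinder -- again an arbitrary choice
phi = np.sin(np.pi * (T_grid - 0.05) / 0.95) ** 4 * np.sin(X) ** 4

# S phi = -phi_t - phi_xx, via finite differences
phi_t = np.gradient(phi, dt, axis=0)
phi_xx = np.gradient(np.gradient(phi, dx, axis=1), dx, axis=1)
S_phi = -phi_t - phi_xx

# <u, S phi>: integral over the cylinder (Riemann sum; the integrand
# vanishes on the boundary, so this is accurate)
pairing = np.sum(u * S_phi) * dt * dx
print(abs(pairing))  # small: u solves the heat equation, so <u, S phi> = 0
```

Up to discretization and series-truncation error the pairing is zero, which is exactly the statement $\langle Tu,\phi\rangle=\langle u,S\phi\rangle=0$ tested against one $\phi$.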