Hille–Yosida theorem application


Disclaimer: pretty long and specific (contraction semigroups involved).
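For reference, here is the version of the Hille–Yosida theorem for contraction semigroups that the checks below (density, closedness, resolvent set, resolvent bound) are aimed at; this is my own paraphrase, not a quote from a specific textbook:

```latex
\textbf{Theorem (Hille--Yosida, contraction case).}
Let $A \colon D(A) \subset X \to X$ be a linear operator on a Banach space $X$.
Then $A$ generates a contraction semigroup $\{S(t)\}_{t \ge 0}$ on $X$
if and only if
\[
  A \text{ is closed and densely defined}, \qquad
  (0,\infty) \subset \rho(A), \qquad
  \|(\lambda I - A)^{-1}\| \le \frac{1}{\lambda}
  \quad \text{for all } \lambda > 0 .
\]
% For A = -\Delta^2 one has \lambda I - A = \lambda I + \Delta^2,
% which is why the resolvent below is (\lambda I + \Delta^2)^{-1}.
```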

I have the fourth-order parabolic equation $$ u_t + \Delta^2 u = 0 $$ on $U_T = U \times [0,T]$, where $U \subset \mathbb{R}^m$ is a bounded open set with smooth boundary. The initial condition is $$ u(x,0) = g \in L^2(U) $$ and the boundary conditions are $$ u=\frac{\partial u}{\partial n}=0 \quad \text{on } \partial U \times [0,T]. $$ I would like to prove the existence of a weak solution. From the Hille–Yosida theorem I deduce that a weak solution exists if the operator $-\Delta^2$ generates a contraction semigroup on $L^2(U)$ (is this ok?). To prove this I define (is this ok?) $$ D(-\Delta^2) = H^4(U) \cap H_0^2 (U). $$ Density:

$C_0^{\infty}(U) \subset H^4(U) \cap H_0^2 (U)$ and $C_0^{\infty}(U)$ is dense in $L^2(U)$, so $D(-\Delta^2)$ is dense in $L^2(U)$ (is this ok?)

Closedness:

Let $\{u_k\}_{k=1}^{\infty} \subset D(-\Delta^2)$ with \begin{align*} u_k & \to u \\ -\Delta^2 u_k & \to f \end{align*} in $L^2(U)$ as $k \to \infty$. By elliptic regularity for the clamped biharmonic problem I have $$ ||u_k - u_l||_{H^4(U)} \leq C(||\Delta^2 u_k - \Delta^2 u_l||_{L^2(U)} + ||u_k - u_l||_{L^2(U)}), $$ so $\{u_k\}$ is Cauchy in $H^4(U)$; hence $u \in H^4(U) \cap H_0^2(U) = D(-\Delta^2)$ and $-\Delta^2 u = f$.
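For the conclusion $u \in H^4(U)$ the estimate has to control the full $H^4$ norm. On a smooth bounded domain this is the standard a priori estimate for the clamped biharmonic problem; I state it here as a known fact of elliptic regularity rather than prove it:

```latex
\[
  \|v\|_{H^4(U)} \;\le\; C \left( \|\Delta^2 v\|_{L^2(U)} + \|v\|_{L^2(U)} \right)
  \qquad \text{for all } v \in H^4(U) \cap H_0^2(U).
\]
% Applied to v = u_k - u_l: since (u_k) and (\Delta^2 u_k) are both Cauchy
% in L^2(U), the sequence (u_k) is Cauchy in H^4(U). Its limit u therefore
% lies in H^4(U) \cap H_0^2(U), and \Delta^2 u_k \to \Delta^2 u in L^2(U),
% which forces -\Delta^2 u = f.
```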

$\lambda \in \mathbb{R}$ belongs to the resolvent set $\rho (-\Delta^2)$ if the operator $\lambda I + \Delta^2: D(-\Delta^2) \to L^2(U)$ is one-to-one and onto.

For $\lambda \in \rho (-\Delta^2)$, the resolvent operator $R_{\lambda}: L^2(U) \to L^2(U)$ is defined by $R_{\lambda}u = (\lambda I + \Delta^2)^{-1} u $.

I also have to prove

$(0,\infty) \subset \rho (-\Delta^2)$:

I show that the equation $\lambda u + \Delta^2 u = f$ has a unique weak solution $u \in H_0^2(U)$ for every $\lambda > 0$ and $f \in L^2(U)$, and assume that the weak solution is also regular, i.e. belongs to $H^4(U)$ (can I do this?). Now $\lambda I + \Delta^2$ is one-to-one and onto for $\lambda > 0$, and thus $(0,\infty) \subset \rho (-\Delta^2)$.
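A sketch of how the existence part can go via Lax–Milgram, assuming the standard fact that $\|\Delta \cdot\|_{L^2(U)}$ is an equivalent norm on $H_0^2(U)$:

```latex
\[
  B[u,v] \;:=\; \lambda \int_U u\,v \,dx \;+\; \int_U \Delta u \,\Delta v \,dx ,
  \qquad u, v \in H_0^2(U).
\]
% Boundedness: |B[u,v]| \le C(\lambda + 1)\,\|u\|_{H^2(U)} \|v\|_{H^2(U)}.
% Coercivity:  B[u,u] = \lambda \|u\|_{L^2(U)}^2 + \|\Delta u\|_{L^2(U)}^2
%              \ge c\,\|u\|_{H^2(U)}^2  for \lambda > 0,
%              using the norm equivalence on H_0^2(U).
% Lax--Milgram: for every f \in L^2(U) there is a unique u \in H_0^2(U) with
%   B[u,v] = \int_U f\,v \,dx  for all v \in H_0^2(U),
% i.e. a unique weak solution of \lambda u + \Delta^2 u = f.
```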

The last thing to prove is

$||R_{\lambda}||_{L^2(U) \to L^2(U)} \leq \frac{1}{\lambda}$:

The weak solution satisfies $$ \lambda \int_U uv \, dx + \int_U \Delta u \, \Delta v \, dx = \int_U fv \, dx $$ for all $v \in H_0^2(U)$. Setting $v=u$ gives $$ \lambda \int_U u^2 \, dx + \int_U (\Delta u)^2 \, dx = \int_U fu \, dx. $$ Since $\int_U (\Delta u)^2 \, dx \geq 0$, Cauchy–Schwarz gives $$ \lambda \int_U u^2 \, dx = \lambda ||u||_{L^2(U)}^2 \leq \int_U fu \, dx \leq ||f||_{L^2(U)} ||u||_{L^2(U)} $$ and hence $$ ||u||_{L^2(U)} \leq \frac{1}{\lambda} ||f||_{L^2(U)}. $$ Since $R_{\lambda}f = u$, we get $$ ||R_{\lambda}|| \leq \frac{1}{\lambda} $$ as desired.
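Not a proof, but a quick numerical sanity check of the bound $||R_\lambda|| \leq 1/\lambda$: discretize the clamped one-dimensional biharmonic operator with finite differences (the grid size and the ghost-point treatment below are my own choices, not from the question) and compute the spectral norm of the discrete resolvent:

```python
import numpy as np

# Discrete clamped biharmonic operator on (0, 1): interior points
# x_i = i*h, i = 1..N-1, with u(0) = u(1) = 0 and u'(0) = u'(1) = 0
# imposed via reflected ghost points u_{-1} = u_1, u_{N+1} = u_{N-1}.
N = 50
h = 1.0 / N
n = N - 1

# Pentadiagonal fourth-difference stencil [1, -4, 6, -4, 1] / h^4.
A = (np.diag(6.0 * np.ones(n))
     + np.diag(-4.0 * np.ones(n - 1), 1)
     + np.diag(-4.0 * np.ones(n - 1), -1)
     + np.diag(np.ones(n - 2), 2)
     + np.diag(np.ones(n - 2), -2))
A[0, 0] = 7.0    # ghost-point correction at the left clamped end
A[-1, -1] = 7.0  # ... and at the right clamped end
A /= h**4

# A is symmetric positive definite, so every eigenvalue of lam*I + A
# exceeds lam and the spectral norm of the resolvent stays below 1/lam.
for lam in [0.5, 1.0, 10.0, 100.0]:
    R = np.linalg.inv(lam * np.eye(n) + A)
    norm_R = np.linalg.norm(R, 2)
    print(f"lambda = {lam:7.1f}: ||R_lambda|| = {norm_R:.3e} <= {1.0 / lam:.3e}")
    assert norm_R <= 1.0 / lam
```

The discrete bound holds for every $\lambda > 0$ for the same reason as in the energy estimate above: the operator is symmetric and positive, so adding $\lambda I$ pushes all eigenvalues above $\lambda$.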

Is this close to ok? Help warmly accepted.