Temperature bounds in heat transfer PDE


I have a heat transfer problem on an arbitrary domain $\Omega$ whose solution is given by \begin{equation}T(x,t)=T_{\infty}(x)+\sum_{n=0}^{\infty}c_n X_n(x) e^{-\lambda_n t} \end{equation} where $T_{\infty}(x)$ is the steady-state solution, $X_n(x)$ and $\lambda_n$ are the eigenfunctions and eigenvalues of the Laplacian, and $c_n$ are coefficients obtained from the initial condition $T_0(x)$ such that \begin{equation}\sum_{n=0}^{\infty}c_n X_n(x) =T_0(x) - T_{\infty}(x). \end{equation} If we know that $T_{\infty}(x) > T_{0}(x) \quad \forall x \in \Omega$, is it possible to show that the temperature evolution in time is always bounded by the initial and steady-state temperatures? \begin{equation} T_{\infty}(x) \geq T(x,t) \geq T_0(x) \quad \forall t \in [0,\infty)\end{equation}
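As a concrete sanity check (not a proof), here is a minimal numerical sketch of the eigenfunction expansion above on the hypothetical 1D domain $[0,1]$ with homogeneous Dirichlet boundaries, $c=k=1$, $X_n=\sin(n\pi x)$, $\lambda_n=(n\pi)^2$, source $g(x)=\pi^2\sin(\pi x)$ (so $T_\infty=\sin(\pi x)$), and $T_0=0$; all of these specific choices are assumptions for illustration, not the asker's actual problem:

```python
import numpy as np

# Hypothetical 1D example on [0,1], homogeneous Dirichlet BCs, c = k = 1.
x = np.linspace(0, 1, 201)
dx = x[1] - x[0]
N = 50                               # number of modes kept in the series

T_inf = np.sin(np.pi * x)            # steady state for g(x) = pi^2 sin(pi x)
T0 = np.zeros_like(x)                # initial condition, T0 < T_inf in the interior

# Project T0 - T_inf onto the eigenfunctions:
# c_n = 2 * integral of (T0 - T_inf) * sin(n pi x) dx
c = [2 * np.sum((T0 - T_inf) * np.sin(n * np.pi * x)) * dx
     for n in range(1, N + 1)]

def T(t):
    """Series solution T(x,t) = T_inf + sum_n c_n sin(n pi x) exp(-(n pi)^2 t)."""
    s = T_inf.copy()
    for n in range(1, N + 1):
        s += c[n - 1] * np.sin(n * np.pi * x) * np.exp(-((n * np.pi) ** 2) * t)
    return s

# Check the conjectured bounds T0 <= T(x,t) <= T_inf at a few sample times.
for t in [0.0, 0.01, 0.1, 1.0]:
    Tt = T(t)
    print(t, np.all(Tt >= T0 - 1e-6), np.all(Tt <= T_inf + 1e-6))
```

In this particular example only the $n=1$ coefficient is nonzero, so $T(x,t)=\sin(\pi x)\,(1-e^{-\pi^2 t})$ and the bounds visibly hold; of course a single example says nothing about the general domain and boundary conditions in the question.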

Edit 1: The original problem is the following: \begin{equation} c\frac{\partial T}{\partial t}=\nabla \cdot (k \nabla T) + g(x) \\ T(x,0)=T_0(x) \\ T=T_D(x)\;\mathrm{on}\;\Gamma_D \quad -k\nabla T \cdot n=q_N \; \mathrm{on} \; \Gamma_N \quad k\nabla T \cdot n + h(T-T_f(x))=0 \; \mathrm{on} \; \Gamma_R \end{equation} where $c$ is the volumetric heat capacity, $k$ the thermal conductivity, and $g(x)$ a heat source. $T_{\infty}(x)$ solves the steady-state problem with the same time-independent heat source and non-homogeneous boundary conditions: \begin{equation} -\nabla \cdot (k \nabla T_{\infty}) = g(x) \\ T_{\infty}=T_D(x)\;\mathrm{on}\;\Gamma_D \quad -k\nabla T_{\infty} \cdot n=q_N \; \mathrm{on} \; \Gamma_N \quad k\nabla T_{\infty} \cdot n + h(T_{\infty}-T_f(x))=0 \; \mathrm{on} \; \Gamma_R \end{equation}