Is it possible to show the following, or something similar after adjusting the assumptions slightly? Let $B \subset \Omega \subset \mathbb{R}^3$ with $B$ open and nonempty and $\Omega$ bounded, and let $f_n:[0,T]\times\Omega \rightarrow \mathbb{R}^3$ be a sequence of mappings of the form $f_n(t,x)= v_n(t) + A_n(t)x$, where $v_n(t)$ is a vector in $\mathbb{R}^3$ and $A_n(t)$ is a matrix in $\mathbb{R}^{3\times3}$. Further assume
\begin{equation} \int_0^T \sup_{x \in B} |f_n(t,x)|^2dt < c_1. \end{equation}
Then we also have
\begin{equation} \int_0^T \sup_{x \in \Omega} |f_n(t,x)|^2dt < c_2, \end{equation}
where both $c_1,c_2$ are constants independent of $n$. At first I thought this would be trivial. Since $v_n$ and $A_n$ are independent of $x$, the integrability with respect to $t$ should survive extending $B$ to $\Omega$, as $x$ is constant with respect to this integration; the supremum should also cause no problems, since $\Omega$ is bounded. But something like $v_n(t) = -A_n(t)x$ might happen for some $x \in B$, so that $f_n(t,\cdot)$ cancels there while we can't say anything about $x \in \Omega$, and I don't know how to rule this out. I tried writing
\begin{equation} \int_0^T \sup_{x \in \Omega} |f_n(t,x)|^2dt = \int_0^T \sup_{x \in \Omega} \left(|v_n(t)|^2 + 2\langle v_n(t), A_n(t)x \rangle + |A_n(t)x|^2 \right) dt. \end{equation}
If $\langle v_n(t), A_n(t)x \rangle$ were non-negative, the statement would follow, but since we can't guarantee this, the terms in the integral might again cancel each other for $x \in B$. On the other hand, I can't think of a counterexample to the assertion. Can anyone help me here? Thank you in advance.
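Edit: as a numerical sanity check (not a proof, and with $B$, $\Omega$, and the sampling scheme chosen arbitrarily for illustration), I tried the feared cancellation $v = -A x_0$ for a point $x_0 \in B$ in a small Python experiment, comparing the sampled supremum over $B$ with that over $\Omega$:

```python
import numpy as np

# Illustrative choices (assumptions, not from the problem statement):
# B is sampled as a small box around the origin, Omega as [-10, 10]^3.
rng = np.random.default_rng(0)
B_pts = rng.uniform(-0.1, 0.1, size=(2000, 3))
Omega_pts = rng.uniform(-10.0, 10.0, size=(2000, 3))

def sup_norm(v, A, points):
    """Sampled sup over the points of |v + A x|."""
    return np.linalg.norm(v + points @ A.T, axis=1).max()

A = rng.standard_normal((3, 3))
x0 = np.array([0.05, 0.0, 0.0])  # a point of B
v = -A @ x0                      # forces f(x0) = 0, the feared cancellation

sB = sup_norm(v, A, B_pts)       # sup over B
sO = sup_norm(v, A, Omega_pts)   # sup over Omega
print(sB, sO, sO / sB)
```

In runs like this the ratio $\sup_\Omega / \sup_B$ comes out finite: even though $f$ vanishes at $x_0$, the supremum over the whole of $B$ still sees $A$ at points away from $x_0$. That makes me suspect the cancellation alone cannot break the estimate, but I don't see how to turn this into a proof.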