I have two i.i.d. standard normal random variables $\epsilon_0, \epsilon_1 \sim \mathcal{N}(0,1)$.
Let $$ U_0 = \epsilon_0 $$ $$ U_1 = \epsilon_0 e^{-\lambda} + (1-e^{-2\lambda})^\frac{1}{2}\epsilon_1 $$ where $\lambda$ is an almost surely positive random variable, independent of $\epsilon_0$ and $\epsilon_1$.
I need to understand whether $U_0$ and $U_1$ are jointly normal.
I have the following arguments; are they correct?
Two variables are jointly normal if and only if the linear combination $X := \alpha U_0 + \beta U_1$ is normally distributed for all $\alpha, \beta \in \mathbb{R}$:
$$
X =\epsilon_0(\alpha + \beta e^{-\lambda}) + \epsilon_1 \beta (1 - e^{-2\lambda})^\frac{1}{2}
$$
Let's condition on $\lambda$ and apply the law of total probability to the PDF of $X$:
$$
f_X(t) = \int \limits_{-\infty}^{+\infty} f_{X \mid \lambda = s}(t \mid s) d F_\lambda(s)
$$
Conditional on $\lambda = s$, $X$ is a linear combination of two independent standard normal variables, hence normal: $X \mid \lambda = s \sim \mathcal{N}(0, \sigma_s^2)$, where
$$
\sigma_s^2 = (\alpha + \beta e^{-s})^2 + \beta^2 (1 - e^{-2s}) = \alpha^2 + 2\alpha \beta e^{-s} + \beta^2
$$
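As a concrete sanity check (my own choice of constants, not forced by the problem), take $\alpha = \beta = 1$:
$$
\sigma_s^2 = 2 + 2e^{-s},
$$
which is strictly decreasing in $s$, so distinct values of $s$ give distinct conditional variances.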
Whenever $\alpha\beta \neq 0$, this value differs for different $s$. So take, for example, a simple random variable $\lambda$ with values in $\{0, 1\}$, probability $\dfrac{1}{2}$ each; then the PDF of $X$ is:
$$
f_X(t) = \dfrac{1}{2\sqrt{2\pi}}\left(\dfrac{e^{-\tfrac{t^2}{2\sigma_0^2}}}{\sigma_0} + \dfrac{e^{-\tfrac{t^2}{2\sigma_1^2}}}{\sigma_1} \right) \neq \dfrac{1}{\sqrt{2\pi}\sigma}e^{-\tfrac{(t- \mu)^2}{2\sigma^2}} \quad \forall \sigma > 0, \mu \in \mathbb{R}
$$
The inequality holds because Gaussian densities with distinct variances are linearly independent as functions, and here $\sigma_0^2 - \sigma_1^2 = 2\alpha\beta(1 - e^{-1}) \neq 0$ when $\alpha\beta \neq 0$. So $f_X(t)$ cannot be the PDF of any normal random variable, and $U_0$, $U_1$ are not jointly normal.
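As a numerical sanity check (a sketch under the setup above, with my hypothetical choice $\alpha = \beta = 1$), the excess kurtosis of the two-component mixture can be computed in closed form; any nonzero value already rules out normality:

```python
import math

# Hypothetical constants: X = alpha*U0 + beta*U1 with alpha = beta = 1,
# and lambda uniform on {0, 1}.  Conditional on lambda = s, X ~ N(0, sigma_s^2)
# with sigma_s^2 = alpha^2 + 2*alpha*beta*exp(-s) + beta^2.
alpha, beta = 1.0, 1.0
var = [alpha**2 + 2 * alpha * beta * math.exp(-s) + beta**2 for s in (0, 1)]

# For an equal-weight scale mixture of centered normals:
#   E[X^2] = (v0 + v1) / 2,   E[X^4] = 3*(v0^2 + v1^2) / 2,
# so the excess kurtosis is E[X^4] / E[X^2]^2 - 3 (zero for a Gaussian).
m2 = sum(var) / 2
m4 = 3 * sum(v**2 for v in var) / 2
excess_kurtosis = m4 / m2**2 - 3

print(excess_kurtosis)  # strictly positive here, so X is not Gaussian
```

A Monte Carlo estimate from samples of $X$ gives the same sign, but the closed form avoids sampling noise.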
Am I correct?
My main concern: for the argument to work, the inequality must hold on a set of nonzero Lebesgue measure, since changing a density on a set of measure zero does not change the CDF.