Consider 2 random variables $X_1$ and $X_2$:
- $X_1 = Y_1 \sqrt{\alpha + \beta {X_0}^2}$
- $X_2 = Y_2 \sqrt{\alpha + \beta {X_1}^2}$
with $\alpha > 0$ and $\beta \ge 0$,
where $Y_1$ and $Y_2$ are independent and both follow a standard normal distribution $N(0, 1)$. Also, $Y_1$ is independent of $X_0$ and $Y_2$ is independent of $X_1$.
Find the correlation between $X_1$ and $X_2$.
Based on this information, I know that $X_1 \mid X_0 = x_0 \sim N(0, \alpha + \beta x_0^2)$ and $X_2 \mid X_1 = x_1 \sim N(0, \alpha + \beta x_1^2)$. Also, the equation for $X_2$ suggests to me that $X_1$ and $X_2$ are positively correlated. I thought of applying this formula:
- $Cov(X_1,X_2) = E[Cov(X_1,X_2|Z)] + Cov[E(X_1|Z),E(X_2|Z)]$
but when I tried using $Z = X_0$ and $Z = X_1$, it didn't seem to work. Am I choosing the wrong formula?
Too long for a comment.
Did you copy the exercise correctly? It seems you are missing pieces of information, namely the joint distribution of $(Y_1, Y_2, X_0).$ Note that you are constructing $(X_1, X_2) = u(Y_1, Y_2, X_0)$ where $u(y_1,y_2,x_0) = \left( y_1\sqrt{\alpha + \beta x_0^2},\ y_2\sqrt{\alpha + \beta y_1^2(\alpha + \beta x_0^2)} \right).$ With the independence assumptions, you are specifying the distributions of $(Y_1, Y_2),$ $(Y_1, X_0)$ and $(Y_2, X_1).$ I am not sure whether this is enough to specify the distribution of $(Y_1, Y_2, X_0),$ but it seems it is not.
Notice that if you have a random vector $(X, Y, Z)$ and you specify $(X,Y),$ $(Y,Z)$ and $(X,Z),$ what you are doing is saying what the projections of the distribution onto each pair of axes are, but this does not specify the distribution itself, with a few exceptions. For instance, if you say that $G = (X,Y,Z)$ is Gaussian, then you restrict the shape of the distribution of $G$ (from arbitrary to Gaussian) and knowing the projections is enough; another instance where knowing the projections is enough is when you specify the distribution of $G$ to be of the form $F_X(x)F_Y(y)F_Z(z),$ and then provide $F_X, F_Y$ and $F_Z.$ But in general, the 2-argument projections of a 3-argument function do not suffice to determine the 3-argument function (and the same holds when $(2,3)$ is changed to $(k, k+p)$).
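To make this concrete, here is a small check (Python is my choice of illustration, not part of the exercise) using the classic XOR example: two different joint distributions on $\{0,1\}^3$ whose three pairwise projections all agree.

```python
from itertools import product

# Two joint distributions on {0,1}^3 with identical pairwise projections.
# Joint 1: X, Y, Z i.i.d. Bernoulli(1/2)  -> every triple has mass 1/8.
# Joint 2: X, Y i.i.d. Bernoulli(1/2), Z = X XOR Y -> four triples, mass 1/4 each.
joint_iid = {(x, y, z): 1 / 8 for x, y, z in product((0, 1), repeat=3)}
joint_xor = {(x, y, x ^ y): 1 / 4 for x, y in product((0, 1), repeat=2)}

def pair_marginal(joint, i, j):
    """Project a joint distribution onto the pair of coordinates (i, j)."""
    marg = {}
    for triple, p in joint.items():
        key = (triple[i], triple[j])
        marg[key] = marg.get(key, 0.0) + p
    return marg

# All three pairwise projections agree...
for i, j in [(0, 1), (0, 2), (1, 2)]:
    assert pair_marginal(joint_iid, i, j) == pair_marginal(joint_xor, i, j)

# ...yet the joints differ: (0, 0, 1) has mass 1/8 under one, 0 under the other.
print(joint_iid[(0, 0, 1)], joint_xor.get((0, 0, 1), 0.0))
```

So the pairwise projections cannot distinguish the two joints, even though one of them makes $Z$ a deterministic function of $(X, Y).$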
Addendum: you are correct in saying that $X_1 \mid X_0 = x_0$ and $X_2 \mid X_1 = x_1$ are normal random variables. What is not correct is that they are positively correlated (heuristic justification: if $X$ is symmetric around its mean, then $X$ and $X^2$ have zero correlation). Let me show you this; for that I will assume $Y_1$ is jointly independent of $(X_0, Y_2).$ Then, $$ \mathbf{Cov}(X_1, X_2) = \mathbf{E}(X_1 X_2) - \mathbf{E}(X_1) \mathbf{E}(X_2). $$ In $\mathbf{E}(X_1)$, condition on $X_0$ to reach $\mathbf{E}(X_1) = 0.$ In $\mathbf{E}(X_1 X_2)$, condition on $(X_0, Y_2)$; then $$ \begin{align*} \mathbf{E}(X_1 X_2 &\mid X_0 = x_0, Y_2 = y_2) \\ &= \mathbf{E} \left( Y_1 \sqrt{\alpha + \beta x_0^2}\, y_2 \sqrt{\alpha + \beta Y_1^2 (\alpha + \beta x_0^2)} \,\middle|\, X_0 = x_0, Y_2 = y_2 \right) \\ &= c\, \mathbf{E} \big( Y_1\, g(Y_1^2) \big), \end{align*} $$ where $c = y_2 \sqrt{\alpha + \beta x_0^2}$ and $g(t) = \sqrt{\alpha + \beta t (\alpha + \beta x_0^2)},$ the last equality by independence. Writing the last expression as an integral, you have to integrate the odd function $t \mapsto t\, g(t^2)\, n(t),$ where $n$ is the standard Gaussian density, which is symmetric. It follows that this expectation is zero, hence $\mathbf{E}(X_1 X_2) = 0$ and $(X_1, X_2)$ have zero correlation. If they were jointly Gaussian, they would be independent. Now, as an exercise, construct two sets $A$ and $B$ such that $P(X_1 \in A, X_2 \in B) \neq P(X_1 \in A) P(X_2 \in B).$
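A quick Monte Carlo sanity check of this conclusion (a sketch in Python/NumPy; the values $\alpha = 1$, $\beta = 0.5$, $x_0 = 1$ are arbitrary choices of mine, with $X_0$ fixed for simplicity): the sample correlation of $(X_1, X_2)$ sits near zero, while the correlation of $(X_1^2, X_2^2)$ does not, so the variables are uncorrelated yet dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
alpha, beta, x0 = 1.0, 0.5, 1.0  # arbitrary choices with alpha > 0, beta >= 0

# Y1, Y2 i.i.d. standard normal, independent of each other (x0 is held fixed).
y1 = rng.standard_normal(n)
y2 = rng.standard_normal(n)

x1 = y1 * np.sqrt(alpha + beta * x0**2)
x2 = y2 * np.sqrt(alpha + beta * x1**2)

corr = np.corrcoef(x1, x2)[0, 1]           # close to 0: uncorrelated
corr_sq = np.corrcoef(x1**2, x2**2)[0, 1]  # clearly nonzero: dependent
print(f"corr(X1,  X2)   = {corr:.4f}")
print(f"corr(X1^2,X2^2) = {corr_sq:.4f}")
```

The nonzero correlation of the squares matches the hint at the end: the dependence of $X_2$ on $X_1$ is channelled entirely through $X_1^2,$ which is invisible to the (linear) correlation of $X_1$ and $X_2$ themselves.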