I'm working through a course in Probability (2nd/3rd year) and would like to clarify some ideas on joint distributions.
Suppose, for example, we have independent random variables $Z_1, Z_2$, each standard normal, i.e. $N(0,1)$, and we define new variables $X(Z_1, Z_2)$ and $Y(Z_1, Z_2)$ as functions of $Z_1, Z_2$. How do we find the marginal distributions of $X$ and $Y$, and their joint distribution?
If we look at a specific case, say $X=Z_1+Z_2$ and $Y=2Z_1+Z_2$, how could we find the conditional expectation of $Y$ given a fixed value of $X$, i.e. $E[Y|X=\alpha]$ for some $\alpha >0$?
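To make the question concrete, here is a small simulation I put together (using numpy; the seed and sample size are arbitrary) that draws $Z_1, Z_2$ and inspects the empirical moments of $X$ and $Y$:

```python
import numpy as np

# Simulate Z1, Z2 ~ N(0,1) independently, then form X = Z1 + Z2
# and Y = 2*Z1 + Z2 and look at their empirical moments.
rng = np.random.default_rng(0)
n = 1_000_000  # arbitrary sample size
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x = z1 + z2
y = 2 * z1 + z2

print(x.mean(), x.var())   # empirical mean and variance of X
print(y.mean(), y.var())   # empirical mean and variance of Y
print(np.cov(x, y)[0, 1])  # empirical covariance of X and Y
```

The marginal moments come out clearly (the variances settle near 2 and 5, with covariance near 3), but I don't see how to derive the joint distribution or conditional expectations from first principles.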
I would be very appreciative of anyone who could help clarify these ideas. Best, MM.
When you are dealing with sums of independent normally distributed random variables (possibly with different variances), you can find a relevant discussion in the question and answers at stats.stackexchange.com/questions/9071/
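In fact, since every linear combination of $Z_1,Z_2$ is normal, $(X,Y)$ as defined above are jointly (bivariate) normal, so the marginals are $X\sim N(0,2)$ and $Y\sim N(0,5)$, and the standard conditional-mean formula for a bivariate normal pair applies:

$$E[Y|X=\alpha] = \mu_Y + \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}\,(\alpha-\mu_X).$$

Here $\mu_X=\mu_Y=0$, $\operatorname{Var}(X)=2$, $\operatorname{Var}(Y)=5$, and $\operatorname{Cov}(X,Y)=\operatorname{Cov}(Z_1+Z_2,\,2Z_1+Z_2)=3$.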
For your specific question, symmetry gives $E[Z_1|Z_1+Z_2=\alpha] = E[Z_2|Z_1+Z_2=\alpha] = \alpha/2$, so, writing $Y = Z_1 + (Z_1+Z_2)$, $$E[Y|X=\alpha]= E[2Z_1+Z_2|Z_1+Z_2=\alpha]=\frac{\alpha}{2}+\alpha=\frac{3\alpha}{2},$$ which is fairly intuitive.
You can also use this to see $E[2Z_1|2Z_1+Z_2=\beta] = 4\beta/5$, so $E[-Z_1|2Z_1+Z_2=\beta] = -2\beta/5$, and writing $X = (2Z_1+Z_2) - Z_1$ gives $$E[X|Y=\beta]=E[Z_1+Z_2|2Z_1+Z_2=\beta] = \beta - \frac{2\beta}{5} = \frac{3\beta}{5},$$ which I think is less intuitive.
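If it helps, both results are easy to check numerically. Here is a minimal Monte-Carlo sketch (numpy; the seed, sample size, and window half-width of 0.05 are arbitrary choices) that approximates the conditional expectations by averaging over samples falling in a thin slice around the conditioning value:

```python
import numpy as np

# Monte-Carlo check: approximate E[Y | X = alpha] and E[X | Y = beta]
# by averaging over samples in a thin window around the conditioning value.
rng = np.random.default_rng(1)
n = 2_000_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x = z1 + z2
y = 2 * z1 + z2

alpha, beta = 1.0, 1.0
e_y_given_x = y[np.abs(x - alpha) < 0.05].mean()
e_x_given_y = x[np.abs(y - beta) < 0.05].mean()
print(e_y_given_x)  # close to 3*alpha/2 = 1.5
print(e_x_given_y)  # close to 3*beta/5  = 0.6
```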