Assume
$Z=X+Y$, given that $X \sim N(\mu_x,\sigma_x^2)$ and $Y \sim N(\mu_y,\sigma_y^2)$ are independent.
I'm interested in computing the joint distribution $f_{Z,X}$.
What I think:
Well, I know the distribution of $Z$: $Z\sim N(\mu_x+\mu_y , \sigma_x^2+\sigma_y^2 )$. We also know the distribution of $X$.
To find the joint distribution, I could try finding the joint CDF and then taking derivatives, though I suspect this might not be the best way to do it.
I'm thinking of something like:
$F_{Z,X}(z,x) = P(Z \leq z, X \leq x) $
Then, the final expression would involve something like $\min(\cdot , \cdot)$, where the dots represent some arguments, since it seems that satisfying both conditions at once will require the $\min$ operator.
Does anybody know a better way? Thanks!
$(Z,X) = (X+Y, X)$ is obtained from $(X,Y)$ via a linear transformation, and since $(X,Y)$ has a bivariate Gaussian distribution, so does $(Z, X)$. This is a property (in fact a defining property) of multivariate Gaussian distributions: linear transformations preserve multivariate Gaussianity.
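To sketch that out, the linear map in question and the resulting moments (using independence, so the covariance matrix of $(X,Y)$ is diagonal) are:

```latex
\begin{pmatrix} Z \\ X \end{pmatrix}
= A \begin{pmatrix} X \\ Y \end{pmatrix},
\qquad
A = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix},
\qquad
\Sigma = \begin{pmatrix} \sigma_x^2 & 0 \\ 0 & \sigma_y^2 \end{pmatrix},
```

and since an affine image of a Gaussian vector has mean $A\mu$ and covariance $A\Sigma A^\top$,

```latex
\begin{pmatrix} Z \\ X \end{pmatrix}
\sim N\!\left(
\begin{pmatrix} \mu_x + \mu_y \\ \mu_x \end{pmatrix},\;
\begin{pmatrix} \sigma_x^2 + \sigma_y^2 & \sigma_x^2 \\ \sigma_x^2 & \sigma_x^2 \end{pmatrix}
\right).
```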
So, all you need to do is figure out the mean and variance of $Z=X+Y$ (the mean and variance of $X$ are already known to you), then the covariance of $Z$ and $X$, and you can write down the distribution of $(Z,X)$ with very little effort.
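If you want a quick numerical sanity check of those moments, here is a small simulation sketch (the parameter values below are illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical parameter values, chosen only for illustration
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y = 2.0, 3.0

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)  # drawn independently of x
z = x + y

# Theory: E[Z] = mu_x + mu_y,  Var(Z) = sigma_x^2 + sigma_y^2,
#         Cov(Z, X) = Cov(X, X) + Cov(Y, X) = sigma_x^2
print(z.mean(), z.var(), np.cov(z, x)[0, 1])
```

With these parameters the printed values should be close to $-1$, $13$, and $4$ respectively, matching the theoretical mean of $Z$, variance of $Z$, and covariance of $Z$ with $X$.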