Normal Product and Quotient Distributions


Ok, my question got closed, as it was too broad. I apologize.

I feel the four parts are all related, and relevant.

Random variable $X\sim N\left( \mu _{X},\sigma _{X} ^{2}\right) $, and random variable $Y\sim N\left( \mu _{Y},\sigma _{Y} ^{2}\right)$

What are the following distributions?

(1) $X+Y$,

(2) $X-Y$,

(3) $XY$,

(4) $X/Y$

My attempt:

I started by using the characteristic function, but I think that may not be necessary, and it only works if $X$ and $Y$ are independent.

I assume that the distribution is always determined by the mean and standard deviation, so does the problem amount to finding those for each case?

So for

1) $E(X+Y) = E(X) + E(Y) = \mu _{X} + \mu _{Y}$

$\operatorname{Var}(X+Y) = \operatorname{Var}(X)+\operatorname{Var}(Y)+ 2\rho\sigma _{X}\sigma _{Y}$

So $(X+Y)\sim N\left( \mu _{X} + \mu _{Y},\ \sigma ^{2}_{X}+\sigma ^{2}_{Y}+2\rho\sigma _{X}\sigma _{Y}\right)$

Similarly:

2) $(X-Y)\sim N\left( \mu _{X} - \mu _{Y},\sigma ^{2}_{X}+\sigma ^{2}_{Y}-2\rho\sigma _{X}\sigma _{Y}\right)$
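As a sanity check on (1) and (2), here is a small Monte Carlo sketch, assuming NumPy is available. The particular means, standard deviations, and $\rho$ are arbitrary illustration choices, not values from the question:

```python
import numpy as np

# Monte Carlo check of the mean/variance formulas for X+Y and X-Y.
# The concrete parameter values below are arbitrary illustration choices.
rng = np.random.default_rng(0)
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y = 2.0, 3.0
rho = 0.5

cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
x, y = rng.multivariate_normal([mu_x, mu_y], cov, size=1_000_000).T

s, d = x + y, x - y
print(s.mean(), s.var())   # ≈ -1 and 4 + 9 + 2(0.5)(2)(3) = 19
print(d.mean(), d.var())   # ≈  3 and 4 + 9 - 2(0.5)(2)(3) = 7
```

The simulated variances match the $\pm 2\rho\sigma_X\sigma_Y$ terms, so the correlation coefficient does belong in the formulas.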

Now I am not sure about the correlation coefficient $\rho$, is it ok to use that?

With 3) and 4) somebody pointed out that I should make use of product distribution and quotient distribution, but I do not see how.

There are 2 best solutions below

Let's first find $E[XY]$ and $\operatorname{Var}[XY]$ for $X,Y \sim \mathcal{N}(0,1)$; afterwards we can simply apply the transformations $U = \frac{X-\mu_1}{\sigma_1}$ and $V = \frac{Y-\mu_2}{\sigma_2}$, which are normal with mean $0$ and variance $1$.

Let, now, $X,Y\sim\mathcal{N}(0,1)$. \begin{equation} E[XY] = \int_{-\infty}^{\infty}E[XY|X=x]\cdot f_X(x)dx = \int_{-\infty}^{\infty}xE[Y|X=x]\cdot f_X(x)dx \end{equation}

Now, let us compute $E[Y|X=x]$. $$E[Y|X=x]=\int_{-\infty}^{\infty}y\,f_{Y|X}(y|x)\,dy = \rho x $$ Let me know if you have problems understanding where this comes from and I will explain further.

Substituting this into our first equation, we get: $$E[XY] = \int_{-\infty}^{\infty}\rho x^2\cdot f_X(x)\,dx = \rho E[X^2] = \rho \left(E[X^2]-E[X]^2\right)=\rho \operatorname{Var}[X]=\rho $$ The second-to-last equality holds because $E[X] = 0$, and the last one because $\operatorname{Var}[X]=1$. Hence, for the case when the means are $0$ and the variances are $1$: $$\rho = E[XY] = E[XY]-E[X]E[Y] $$ And for the general one: $$\rho = \frac{E[XY]-E[X]E[Y]}{\sigma_X\sigma_Y} $$ Try the variance case yourself and let me know if/where you get stuck and I'll try to guide you.
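A quick numerical check of $E[XY]=\rho$ for standard bivariate normals, assuming NumPy is available (the value of $\rho$ is an arbitrary choice):

```python
import numpy as np

# Check E[XY] = rho when X, Y are standard bivariate normal with correlation rho.
rng = np.random.default_rng(1)
rho = 0.7  # arbitrary correlation for the check
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T
print((x * y).mean())   # ≈ 0.7
```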

To explain $E[Y|X=x]=\int_{-\infty}^{\infty}y\,f_{Y|X}(y|x)\,dy = \rho x$: you first need to find the marginal distribution $f_X$, which, after integrating the joint density over $y$, yields $f_X(x) = \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}x^2}$.

Hence, $$f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)} = \frac{1}{\sqrt{2\pi(1-\rho^2)}}\exp\left(-\frac{(y-\rho x)^2}{2(1-\rho^2)}\right). $$ In this way you will get the result. It is a tedious calculation (use Wolfram Alpha for the steps in between).
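The conditional moments $E[Y|X=x]=\rho x$ and $\operatorname{Var}[Y|X=x]=1-\rho^2$ can also be checked empirically by conditioning on a thin slice around a fixed $x$; this is a sketch assuming NumPy, with an arbitrary $\rho$ and conditioning point:

```python
import numpy as np

# Empirical check that E[Y | X = x0] ≈ rho * x0 and Var[Y | X = x0] ≈ 1 - rho^2.
rng = np.random.default_rng(2)
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=2_000_000).T

x0 = 1.0                        # arbitrary conditioning point
mask = np.abs(x - x0) < 0.02    # thin slice around x0
print(y[mask].mean())           # ≈ rho * x0 = 0.6
print(y[mask].var())            # ≈ 1 - rho**2 = 0.64
```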

Finding the distribution of the sum and difference.

$$U = X+Y\\V=X-Y$$

Which yields

$$X=\frac{U+V}{2}\\Y=\frac{U-V}{2}.$$

Now you can apply the Jacobian Formula in order to get $f_{U,V}$.

It is important to note that, when $X$ and $Y$ are independent, you can get the sum simply by using the convolution theorem!
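The convolution route (for independent $X$ and $Y$) can be verified numerically, assuming NumPy: discretize the two marginal densities on a grid, convolve them, and compare against the known $N(\mu_1+\mu_2,\ \sigma_1^2+\sigma_2^2)$ density. The example parameters are arbitrary:

```python
import numpy as np

# Numerical check of the convolution route for INDEPENDENT X and Y:
# the density of X + Y is the convolution of the two marginal densities.
def normal_pdf(t, mu, sigma):
    return np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

t = np.linspace(-20, 20, 4001)     # grid wide enough to make the tails negligible
dt = t[1] - t[0]
fx = normal_pdf(t, 1.0, 2.0)       # arbitrary example: X ~ N(1, 4)
fy = normal_pdf(t, -2.0, 3.0)      # arbitrary example: Y ~ N(-2, 9)

conv = np.convolve(fx, fy, mode="same") * dt    # discrete approximation of f_X * f_Y
target = normal_pdf(t, -1.0, np.sqrt(13.0))     # N(1 + (-2), 4 + 9)
print(np.max(np.abs(conv - target)))            # tiny discretization error
```

The symmetric grid is what makes `mode="same"` line up with `t` here; on an asymmetric grid the convolution output would need re-centering.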

Finding the distribution of the product and quotient. Again, $$U=XY\\ V=\frac{X}{Y}$$

Then use the Jacobian formula to find the joint pdf, and simply find the marginal pdfs by integration.
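You can also check the product and quotient results by simulation. For *independent* standard normals, the quotient $X/Y$ is known to be standard Cauchy (a classical fact stated here as context, not derived in this answer), and the product $XY$ has mean $0$ and variance $1$; a sketch assuming NumPy:

```python
import numpy as np

# Simulation check for independent standard normals:
# X/Y is standard Cauchy (median 0, upper quartile 1); XY has mean 0, variance 1.
rng = np.random.default_rng(3)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

q = x / y
print(np.median(q))            # ≈ 0 (Cauchy location)
print(np.percentile(q, 75))    # ≈ 1 (Cauchy upper quartile, tan(pi/4))

p = x * y
print(p.mean(), p.var())       # ≈ 0 and 1
```

Note that `q.mean()` would not converge here: the Cauchy distribution has no mean, which is exactly why quantiles are used for the check.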

Hope this solves most of your question!

0
On

Computing the mean and variance of a random variable **does not suffice to determine its distribution** in general. Consider when $X \sim N(0, 1)$ and $Y = X$. Then $XY = X^2$ has a chi-square distribution, which is obviously not normal.
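The counterexample is easy to verify numerically, assuming NumPy: with $Y = X$, the product $X^2$ has the $\chi^2_1$ moments (mean $1$, variance $2$) and is never negative, which no normal variable can be:

```python
import numpy as np

# With Y = X and X ~ N(0,1), the product XY = X^2 is chi-square with 1 df.
rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
xy = x * x
print(xy.mean())       # ≈ 1 (chi-square_1 mean)
print(xy.var())        # ≈ 2 (chi-square_1 variance)
print(xy.min() >= 0)   # True: a normal variable could not be nonnegative
```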

When $(X, Y)$ is bivariate normal we are assured that any linear combination of $X$ and $Y$ is normal, so, in this quite special case, computing the mean and variance does indeed suffice for determining the distribution of linear combinations of the random variables. Computing the distribution of the product or quotient is another matter.

Without further information about the joint distribution, dependence, or correlation of $X$ and $Y$, you have been given a rather difficult problem.