Compute pdf of correlated part from joint probability of two variables.


Can I compute pdf(z) if I know the joint probability density pdf(X+z, Y+z)? Here X, Y, and z are independent, zero-mean variables. The pdfs of X, Y, and z are unknown, but we can assume that pdf(X) and pdf(Y) are Gaussian. We can also assume that the variance of X and Y is much larger than that of z. This last condition effectively prevents measuring pdf(X, Y) by setting z = 0 and then using deconvolution.

There is an indirect method, so I know that the problem is solvable. From pdf(X+z, Y+z) I can compute the cross-moments M(n,m) = E[(X+z)^n (Y+z)^m]. At low order these isolate the moments of z: for example M(1,1) = E[z^2] and M(1,2) = E[z^3], since every term containing X or Y to an odd power, or a lone factor of z, averages to 0. (Higher cross-moments also pick up mixed terms such as E[X^2]E[z^2], which must be subtracted using the variances of X and Y.) From the moments of z we can then compute its pdf.
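As a quick numerical sanity check of the lowest cross-moment (not part of the question itself; the specific distributions below are assumptions chosen for illustration): with U = X + z and V = Y + z, independence and zero means give E[UV] = E[z^2], so the variance of z can be read off the measured joint even when it is much smaller than the variances of X and Y.

```python
import numpy as np

# Toy check of the cross-moment observation. The distributions are
# illustrative assumptions: X, Y are wide Gaussians, z is a small
# hidden uniform component with Var(z) = 1/12.
rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(0, 3.0, n)          # Var(X) >> Var(z), as in the question
y = rng.normal(0, 3.0, n)
z = rng.uniform(-0.5, 0.5, n)      # Var(z) = 1/12
u, v = x + z, y + z                # the observable pair (X+z, Y+z)

m11 = np.mean(u * v)               # estimates M(1,1) = E[z^2]
print(m11)
```

Despite Var(X) and Var(Y) being two orders of magnitude larger than Var(z), the product moment averages them away and leaves an estimate of E[z^2] = 1/12.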



Let the pdfs of $X$, $Y$, and $Z$ be denoted by $f_X(x)$, $f_Y(y)$, and $f_Z(z)$, respectively.

I don't immediately see how this problem can be solved when $f_X(x)$ and $f_Y(y)$ are unknown. However, you say that $X$ and $Y$ are mean $0$ as well as Gaussian, so the only missing pieces of information are their variances. The following solution should work if $f_X(x)$ and $f_Y(y)$ are known.

Define new random variables $U = X + Z$ and $V = Y + Z$. The joint pdf of $U$ and $V$ is $$ f_{UV}(u, v) = \frac{\partial}{\partial u}\frac{\partial}{\partial v} F_{UV}(u, v)\, , $$ where $F_{UV}(u, v)$, the joint cdf of $U$ and $V$, is given by \begin{align} F_{UV}(u, v) &= P\left(X+Z < u,\; Y+Z < v\right)\\[0.1in] &= P\left(X < u-Z,\; Y < v-Z\right)\\[0.1in] &= \int_{-\infty}^{+\infty} dz\, f_Z(z) \int_{-\infty}^{u-z} dx\, f_X(x) \int_{-\infty}^{v-z} dy\, f_Y(y)\, . \end{align}

The joint pdf of $U$ and $V$ then becomes: \begin{align} f_{UV}(u, v) &= \frac{\partial}{\partial u}\frac{\partial}{\partial v} F_{UV}(u, v)\\[0.1in] &= \int_{-\infty}^{+\infty} dz\; f_Z(z)\; f_X(u-z)\; f_Y(v-z) \qquad\qquad (1) \end{align}

Define the two-dimensional Fourier transform of the pdf $f_{UV}(u, v)$ as \begin{equation} \hat{f}_{UV}(s, t) \;\equiv\; \int_{-\infty}^{+\infty}du\, e^{-isu} \int_{-\infty}^{+\infty}dv\, e^{-itv} f_{UV}(u, v)\, , \end{equation} where $s$ and $t$ are frequencies.

Using this definition, take a Fourier transform of both sides of Eq. (1) above to yield \begin{align} \hat{f}_{UV}(s, t) &= \int_{-\infty}^{+\infty} dz\, f_Z(z)\; \int_{-\infty}^{+\infty} du\, e^{-isu}\, f_X(u-z)\; \int_{-\infty}^{+\infty} dv\, e^{-itv}\, f_Y(v-z)\\[0.1in] &= \int_{-\infty}^{+\infty} dz\,e^{-i(s+t)z}\, f_Z(z)\; \hat{f}_{X}(s)\; \hat{f}_{Y}(t)\\[0.1in] &= \hat{f}_{Z}(s+t)\; \hat{f}_{X}(s)\; \hat{f}_{Y}(t)\, . \end{align} Between the first and second lines above, we have made a change of variables $u\rightarrow u+z$, $v\rightarrow v+z$.
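This factorization can be checked empirically via characteristic functions (a toy sketch, not part of the derivation; the distributions chosen for $X$, $Y$, $Z$ are assumptions): the sample mean of $e^{-i(su + tv)}$ estimates $\hat{f}_{UV}(s, t)$, and it should match the analytic product $\hat{f}_Z(s+t)\,\hat{f}_X(s)\,\hat{f}_Y(t)$.

```python
import numpy as np

# Empirical check of  fhat_UV(s,t) = fhat_Z(s+t) * fhat_X(s) * fhat_Y(t)
# at one frequency pair. Illustrative assumptions: X ~ N(0,1),
# Y ~ N(0, 1.5^2), Z ~ Uniform(-1,1) with fhat_Z(w) = sin(w)/w.
rng = np.random.default_rng(2)
n = 400_000
x = rng.normal(0, 1.0, n)
y = rng.normal(0, 1.5, n)
z = rng.uniform(-1, 1, n)
u, v = x + z, y + z

s, t = 0.5, 0.3
lhs = np.mean(np.exp(-1j * (s * u + t * v)))   # empirical fhat_UV(s, t)
rhs = (np.sin(s + t) / (s + t)                 # fhat_Z(s + t)
       * np.exp(-0.5 * (1.0 * s) ** 2)         # fhat_X(s), Gaussian
       * np.exp(-0.5 * (1.5 * t) ** 2))        # fhat_Y(t), Gaussian
print(abs(lhs - rhs))
```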

From here, one can set either $s = 0$ or $t = 0$, to yield $$ \hat{f}_Z(t) = \frac{\hat{f}_{UV}(0, t)}{\hat{f}_Y(t)} $$ or $$ \hat{f}_Z(s) = \frac{\hat{f}_{UV}(s, 0)}{\hat{f}_X(s)}\, , $$ respectively. Note that we have used the fact that $\hat{f}_X(0) = \hat{f}_Y(0) = 1$, which must be true for the Fourier transform of any pdf.
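The same toy setup illustrates the recovery formula numerically (again an assumed configuration: $Y$ Gaussian with known variance, $Z$ uniform). Setting $s = 0$ means we only need the observed marginal $V = Y + Z$; dividing its empirical transform by the known Gaussian transform of $Y$ should reproduce $\hat{f}_Z(t)$. Note that the ratio amplifies estimation noise wherever $\hat{f}_Y(t)$ is small, so in practice only moderate $|t|$ is usable.

```python
import numpy as np

# Recover fhat_Z at a test frequency via fhat_UV(0, t) / fhat_Y(t).
# Assumptions: Y ~ N(0,1) with known variance, Z ~ Uniform(-1,1).
rng = np.random.default_rng(3)
n = 400_000
sigma_y = 1.0
y = rng.normal(0, sigma_y, n)
z = rng.uniform(-1, 1, n)
v = y + z                                      # observed marginal V = Y + Z

t = 0.6
fhat_v = np.mean(np.exp(-1j * t * v))          # empirical fhat_UV(0, t)
fhat_y = np.exp(-0.5 * (sigma_y * t) ** 2)     # known Gaussian transform
fhat_z_est = fhat_v / fhat_y                   # the ratio formula above
fhat_z_true = np.sin(t) / t                    # transform of Uniform(-1,1)
print(abs(fhat_z_est - fhat_z_true))
```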

Assuming that all of the relevant pdfs are analytically known originally, all that is left to do in principle is to take an inverse Fourier transform of one of the above expressions for $\hat{f}_Z$: \begin{align} f_Z(z) &= \frac{1}{2\pi}\int_{-\infty}^{+\infty}dt\, e^{+itz}\, \hat{f}_Z(t) \;=\; \frac{1}{2\pi}\int_{-\infty}^{+\infty}dt\, e^{+itz}\, \frac{\hat{f}_{UV}(0, t)}{\hat{f}_Y(t)}\\[0.1in] &= \frac{1}{2\pi}\int_{-\infty}^{+\infty}ds\, e^{+isz}\, \hat{f}_Z(s) \;=\; \frac{1}{2\pi}\int_{-\infty}^{+\infty}ds\, e^{+isz}\, \frac{\hat{f}_{UV}(s, 0)}{\hat{f}_X(s)} \end{align}

Edit:

It strikes me that $\hat{f}_{UV}(s, 0)$ and $\hat{f}_{UV}(0, t)$ are the Fourier transforms of the marginal pdfs $f_{U}(u)$ and $f_{V}(v)$ of $U$ and $V$, respectively. The final result above could therefore have been obtained in a much shorter way as follows: $$ f_{U}(u) = \int_{-\infty}^{+\infty}dz\, f_Z(z)\, f_X(u-z) \;\;\rightarrow\;\; \hat{f}_U(s) = \hat{f}_Z(s)\, \hat{f}_X(s) $$
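The one-dimensional convolution above can also be verified directly on a grid (a sketch under assumed densities: $X$ Gaussian, $Z$ uniform on $(-1,1)$, for which $f_U$ has a closed form in terms of the Gaussian cdf $\Phi$):

```python
import numpy as np
from math import erf, sqrt

# For X ~ N(0, sigma^2) and Z ~ Uniform(-1, 1),
#   f_U(u) = int f_Z(z) f_X(u - z) dz = (1/2)[Phi((u+1)/sigma) - Phi((u-1)/sigma)].
# Evaluate the convolution integral at u = 0 by the midpoint rule and
# compare with the closed form.
sigma = 2.0
dz = 0.005
zgrid = np.arange(-1, 1, dz) + dz / 2          # midpoints over supp f_Z
f_u0 = np.sum(0.5 * np.exp(-zgrid**2 / (2 * sigma**2))
              / (sigma * np.sqrt(2 * np.pi))) * dz

Phi = lambda t: 0.5 * (1 + erf(t / sqrt(2)))   # standard normal cdf
f_u0_closed = 0.5 * (Phi(1 / sigma) - Phi(-1 / sigma))
print(f_u0, f_u0_closed)
```

The two values agree to numerical precision, confirming the convolution identity whose Fourier transform gives $\hat{f}_U(s) = \hat{f}_Z(s)\,\hat{f}_X(s)$.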