If I know the law of $a X+bY$ for all $a,b \in \mathbb{R}$, do I know joint law of $(X,Y)$?


Given two random variables $X,Y$, suppose that I know the law of $aX+bY$ for every $a,b \in \mathbb{R}$. Can I recover their joint law?

It is clear that the converse holds: the joint law determines the law of every $aX+bY$. For instance, if $(X,Y)$ is a Gaussian vector, I can compute the covariance using the polarization trick $$ \mathbb{E}(XY)= \frac{1}{4}\mathbb{E}\big[(X+Y)^2\big]-\frac{1}{4}\mathbb{E}\big[(X-Y)^2\big], $$ and since I know the marginal laws, this characterises the whole vector.
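(The trick is just the polarization identity: expanding the squares,
$$ (X+Y)^2 - (X-Y)^2 = \big(X^2 + 2XY + Y^2\big) - \big(X^2 - 2XY + Y^2\big) = 4XY, $$
and taking expectations and dividing by $4$ gives the formula above.)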

Can I somehow do the same when $X,Y$ are not Gaussian? Perhaps assuming they have finite moments of every order, or even that the random variables are bounded?

EDIT: Can I extend this to $n$ random variables?

Best answer:

The joint distribution of $X$ and $Y$ is completely determined by the joint characteristic function defined by $\phi(a,b)=\mathbb{E}\,e^{i(aX+bY)}$. But $\phi(a,b)$ is nothing other than the characteristic function of $aX+bY$ evaluated at the point $1$. Hence, if you know the distribution of $aX+bY$ for all $a$ and $b$, you know the joint distribution. The same argument extends to $n$ random variables: the joint characteristic function $\phi(a_1,\dots,a_n)=\mathbb{E}\,e^{i(a_1X_1+\dots+a_nX_n)}$ is the characteristic function of $a_1X_1+\dots+a_nX_n$ at $1$. In specific cases where the joint characteristic function is integrable, it is also possible to write a formula for the joint density (this is the inversion theorem).
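The key identity in the answer can be checked numerically. Below is a minimal sketch (my own example, not from the post) using NumPy: for a non-Gaussian, dependent pair $(X,Y)$, the Monte Carlo estimate of the joint characteristic function $\phi(a,b)=\mathbb{E}\,e^{i(aX+bY)}$ coincides with the estimate of the characteristic function of $Z=aX+bY$ at $t=1$.

```python
import numpy as np

# Sketch: verify that E[exp(i(aX + bY))] is the characteristic
# function of Z = aX + bY evaluated at t = 1.  The distributions
# chosen here are arbitrary non-Gaussian examples.
rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(scale=1.0, size=n)        # non-Gaussian marginal
Y = X + rng.uniform(-1.0, 1.0, size=n)        # dependent on X

a, b = 0.7, -1.3
Z = a * X + b * Y

# Joint characteristic function at (a, b)
phi_joint = np.mean(np.exp(1j * (a * X + b * Y)))
# Characteristic function of Z at t = 1
phi_Z_at_1 = np.mean(np.exp(1j * Z))

assert np.isclose(phi_joint, phi_Z_at_1)      # equal by construction
```

The equality here is exact (both lines average the same quantity), which is precisely the point of the answer: knowing the law of every $aX+bY$ hands you $\phi(a,b)$ for every $(a,b)$, and hence the joint law.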