Joint p.d.f and independence


I vaguely remember a result like this from probability theory, but I forgot how to prove it. Is there a formal name for it? Can someone kindly provide a proof or a link, please?

The random variables $X$ and $Y$ with joint density $f$ are independent if and only if there exist functions $g$ and $h$ such that $f(x,y)=g(x)h(y)$ for (almost) every $(x,y)$ in $\mathbb{R}\times\mathbb{R}$.


I do not know the name of this theorem.


Let $a,b$ be two positive measurable functions. For any such $a,b$,

$$ E(a(X)b(Y)) = \int a(x)b(y) f(x,y)\, dx \, dy. $$

  • Now suppose $f(x,y)=g(x)h(y)$ is a factorization (a.e.); rescaling $g$ and $h$ by constant factors, we may assume $\int h(y)\, dy = 1$. Taking $b \equiv 1$ gives $$ Ea(X) = \int a(x) g(x) h(y)\, dx \, dy = \int a(x) g(x)\, dx, $$hence $g$ is a version of the density of $X$ (and similarly $h$ for $Y$), so $g(x) = \int f(x,y)\, dy$ and $h(y) = \int f(x,y)\, dx$ a.e. Then $$ E(a(X)b(Y)) = \int a(x)b(y)\, g(x)h(y)\, dx\, dy = Ea(X)\, Eb(Y), $$so $X$ and $Y$ are independent.
  • Conversely, assume $X$ and $Y$ are independent, and let $g,h$ be the densities of $X$ and $Y$. Writing out the independence: $$ E(a(X) b(Y)) = Ea(X)\, Eb(Y) \\ \int a(x)b(y) f(x,y)\, dx \, dy = \int a(x) g(x)\, dx \times \int b(y) h(y)\, dy = \int a(x)b(y)\, g(x)h(y)\, dx\, dy, $$using Tonelli's theorem (all integrands are positive). As this holds for every positive-valued $a,b$, $$ f(x,y)=g(x)h(y) $$almost everywhere.
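The marginal-recovery step above is easy to check numerically. Here is a minimal Python sketch (assuming NumPy; the marginals $X\sim\mathrm{Exp}(1)$, $Y\sim\mathrm{Exp}(2)$ are arbitrary choices for illustration) that recovers $g(x)=\int f(x,y)\,dy$ and $h(y)=\int f(x,y)\,dx$ from a joint density that factorizes by construction, and verifies $f(x,y)=g(x)h(y)$ up to quadrature error:

```python
# Numerical sanity check of the factorization criterion, assuming
# X ~ Exp(1) and Y ~ Exp(2) (rates chosen purely for illustration):
# the joint density factorizes by construction, and the marginals
# recovered by integration reproduce it.
import numpy as np

xs = np.linspace(0.0, 20.0, 2001)  # grid covering almost all the mass
ys = np.linspace(0.0, 20.0, 2001)
dx, dy = xs[1] - xs[0], ys[1] - ys[0]

# Joint density f(x, y) = e^{-x} * 2 e^{-2y} evaluated on the grid.
f = np.exp(-xs)[:, None] * (2.0 * np.exp(-2.0 * ys))[None, :]

# Marginals by trapezoidal integration, as in the answer:
g = (0.5 * (f[:, :-1] + f[:, 1:]) * dy).sum(axis=1)  # g(x) = ∫ f(x,y) dy
h = (0.5 * (f[:-1, :] + f[1:, :]) * dx).sum(axis=0)  # h(y) = ∫ f(x,y) dx

# The factorization f(x, y) = g(x) h(y) holds up to quadrature error.
err = np.max(np.abs(f - g[:, None] * h[None, :]))
print(err < 1e-3)  # True
```

This only illustrates the "if" direction on one example; the general statement is about a.e. equality, which no finite grid can certify.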