Determining independence of a joint continuous probability distribution


I'm confused about how to determine whether a joint pdf's two variables X and Y are independent.

The theorem given to me is that X and Y are independent iff f(x,y) = f_X(x)*f_Y(y), which is easy enough to understand.

But another theorem was given to me as an alternative:

f(x,y) = g(x)*h(y), where g(x) and h(y) are non-negative functions of x only and of y only, respectively.

How do I determine g(x) and h(y)? Are they just a factored form of f(x,y)? And if so, is independence determined simply by whether or not I can factor f(x,y)?

An easy example: f(x,y) = 2y, so g(x) = 2 and h(y) = y? Likewise, is g(x) = 0.5 and h(y) = 4y viable (although probably unnecessary)?
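The factorization criterion for this example can be sanity-checked numerically. A minimal sketch, assuming f(x,y) = 2y lives on the unit square [0,1]² (the question doesn't state a support): compute both marginals by integrating the joint density on a midpoint grid, then verify that the joint density equals the product of the marginals.

```python
import numpy as np

# Assumption: f(x,y) = 2y with support [0,1] x [0,1].
f = lambda x, y: 2.0 * y

N = 1000
pts = (np.arange(N) + 0.5) / N          # midpoints of [0,1]
d = 1.0 / N                             # cell width
X, Y = np.meshgrid(pts, pts, indexing="ij")
F = f(X, Y)

fX = F.sum(axis=1) * d                  # integrate out y: f_X(x) = 1
fY = F.sum(axis=0) * d                  # integrate out x: f_Y(y) = 2y

# Independence holds iff the joint density equals the product of marginals.
err = np.max(np.abs(F - np.outer(fX, fY)))
```

Here `err` comes out at floating-point precision, consistent with X and Y being independent: the marginal in x is the constant 1, the marginal in y is 2y, and their product recovers f(x,y) = 2y.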


If $f(x,y) = g(x)h(y)$ (with $g$ and $h$ non-negative and measurable), integrating with respect to $x$ and then with respect to $y$ gives
$$1 = \int\!\!\int f(x,y)\,dx\,dy = \int\left(\int g(x)\,dx\right)h(y)\,dy = \left(\int g(x)\,dx\right)\left(\int h(y)\,dy\right). \tag{1}$$
Now let $g_1 = \frac{g}{\int g(x)\,dx}$ and $h_1 = \frac{h}{\int h(y)\,dy}$. Each of $g_1$ and $h_1$ is non-negative and integrates to $1$, so both are density functions, and by $(1)$
$$f(x,y) = g(x)h(y) = \left(\int g(x)\,dx\right)\left(\int h(y)\,dy\right)g_1(x)h_1(y) = g_1(x)h_1(y).$$
Integrating out $y$ (resp. $x$) then shows that $g_1$ and $h_1$ are the marginals of $f$; independence follows from this.
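This normalization step can be illustrated on the question's own example. A small sketch, assuming f(x,y) = 2y on [0,1]² and taking the "unnecessary" factorization into the constant 0.5 and the factor 4y (the constant playing the role of g(x), since g may depend only on x): normalizing each factor by its integral recovers the true marginals regardless of how the constants were split.

```python
import numpy as np

# Assumption: f(x,y) = 2y on [0,1]^2, factored as g(x) = 0.5, h(y) = 4y.
N = 1000
pts = (np.arange(N) + 0.5) / N   # midpoint grid on [0,1], reused for x and y
d = 1.0 / N

g = np.full(N, 0.5)              # g(x) = 0.5, constant in x
h = 4.0 * pts                    # h(y) = 4y

cg = g.sum() * d                 # integral of g(x) dx = 0.5
ch = h.sum() * d                 # integral of h(y) dy = 2
# By (1), the normalizing constants multiply to 1 since f integrates to 1:
# cg * ch -> 1.0 (up to floating point)

g1 = g / cg                      # g_1(x) = 1  = marginal f_X
h1 = h / ch                      # h_1(y) = 2y = marginal f_Y
```

The point of the answer is visible here: any non-negative factorization g*h differs from the marginal factorization only by constants, and (1) forces those constants to cancel, so g_1 and h_1 are the marginals.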