If random variables $X, Y$ have joint PDF $f(x, y) = g(x)h(y)$ for some functions $g$ and $h$ then $X, Y$ are independent?


I'm reading All of Statistics by Larry Wasserman, and I'm confused by Theorem 2.33:

Suppose that the range of $X$ and $Y$ is a (possibly infinite) rectangle. If $f(x, y) = g(x)h(y)$ for some functions $g$ and $h$ (not necessarily probability density functions) then $X$ and $Y$ are independent.

How can I prove this statement, or get some intuition behind it? Does this also mean that two dependent variables and two independent variables cannot have the same joint PDF?

Thank you!

Accepted answer

The answer to both questions is yes, and you can prove both using Fubini's theorem.

Another answer

@Kavi is right that this follows from Fubini's theorem; here is a detailed solution.

If we fix $x$, note that

$$ f_X(x) = \int_{-\infty}^{\infty} f(x, y) dy = g(x) \int_{-\infty}^{\infty} h(y) dy $$

By symmetry:

$$ f_Y(y) = \int_{-\infty}^{\infty} f(x, y) dx = h(y) \int_{-\infty}^{\infty} g(x) dx $$

Multiplying, we have:

\begin{align} f_X(x) f_Y(y) &= g(x)h(y) \int_{-\infty}^{\infty} h(y) \, dy \int_{-\infty}^{\infty} g(x) \, dx \\ &= f(x,y) \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x)h(y) \, dx \, dy \\ &= f(x,y) \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x,y) \, dx \, dy \\ &= f(x, y) \cdot 1 = f(x, y) \end{align}

Here the second line uses Fubini's theorem to write the product of the two single integrals as a double integral, and the last line uses the fact that a joint PDF integrates to $1$.

So $f(x, y) = f_X(x) f_Y(y)$ for all $x, y$, which is exactly the definition of independence. This also answers the second question: since the joint PDF determines the joint distribution, a dependent pair and an independent pair cannot share the same joint PDF.
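To make the argument concrete, here is a small numerical sanity check (my own illustration, not from the book) using a hypothetical factored density on the unit square: $f(x, y) = g(x)h(y)$ with $g(x) = x$ and $h(y) = 6y^2$. Note that $g$ integrates to $1/2$ and $h$ to $2$, so neither factor is a PDF on its own, yet their product is, since the two integrals multiply to $1$. This matches the theorem's "not necessarily probability density functions" clause.

```python
def g(x):
    return x

def h(y):
    return 6 * y ** 2

def integrate(func, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of func on [a, b]."""
    step = (b - a) / n
    return sum(func(a + (k + 0.5) * step) for k in range(n)) * step

int_g = integrate(g, 0.0, 1.0)   # approximately 1/2
int_h = integrate(h, 0.0, 1.0)   # approximately 2

# Marginals, computed exactly as in the derivation above:
def f_X(x):
    return g(x) * int_h

def f_Y(y):
    return h(y) * int_g

# The product of the marginals recovers the joint density, because
# int_g * int_h = 1 (the joint PDF integrates to 1 over the square).
x0, y0 = 0.3, 0.7
joint = g(x0) * h(y0)
product = f_X(x0) * f_Y(y0)
print(abs(joint - product) < 1e-6)   # should print True
```

The check passes at any point $(x_0, y_0)$ in the square, reflecting that the identity $f_X(x)f_Y(y) = f(x,y)$ holds pointwise, not just at a lucky sample.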