Independence of $X$ and $X^2$


If I have a random variable $X$ where the pdf is always positive, are $X$ and $X^2$ independent? Never, sometimes, or always? I was thinking of finding either the pdf or the cdf of $X^2$, and then the joint pdf/cdf. But without the actual pdf of $X$ I'm not sure how to do that. Is there another way? Any help would be appreciated!
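For the cdf route mentioned in the question: for any continuous $X$, $P(X^2 \le y) = P(-\sqrt y \le X \le \sqrt y) = F_X(\sqrt y) - F_X(-\sqrt y)$ for $y \ge 0$, so no joint distribution is needed to get the marginal of $X^2$. A minimal sketch, with $X$ assumed standard normal purely for concreteness:

```python
import math
import random

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cdf_x_squared(y):
    """CDF of X^2 when X ~ N(0,1): P(X^2 <= y) = Phi(sqrt(y)) - Phi(-sqrt(y))."""
    if y < 0:
        return 0.0
    r = math.sqrt(y)
    return phi(r) - phi(-r)

# Monte Carlo check of the formula at y = 1.
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n) if random.gauss(0.0, 1.0) ** 2 <= 1.0)
empirical = hits / n
exact = cdf_x_squared(1.0)  # about 0.6827 (the 68% rule)
```

This recovers the chi-square distribution with one degree of freedom, but the cdf identity itself holds for any continuous $X$.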

Edit: the pdf is positive for every real value of $X$.

Edit 2: $X$ is a continuous random variable.


3 Answers

Best answer (5 votes):

If $X$ is a non-negative, non-constant RV, then $X$ and $X^2$ are positively correlated, and hence not independent. Here's why: the covariance of $X$ and $X^2$ is by definition the expectation of $$ (X-EX)\left(X^2 - E(X^2)\right).\tag1 $$ Rewrite (1) as $$ (X-EX)\left(X^2-(EX)^2\right) + (X-EX)\left((EX)^2-E(X^2)\right).\tag2 $$ The second term of (2) has expectation zero, since $(EX)^2-E(X^2)$ is a constant and $E(X-EX)=0$. Meanwhile, the first term of (2) factors as $$ (X-EX)^2(X + EX),\tag3 $$ which is a nonnegative, non-constant RV because $X\ge 0$ and $X$ is not constant. Therefore (3) has positive expectation, and so does (1), which means the covariance is positive.
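The positive covariance above is easy to check by simulation. A quick sketch, taking $X \sim \mathrm{Exp}(1)$ as one assumed example of a non-negative, non-constant RV:

```python
import random

# Estimate Cov(X, X^2) for a non-negative, non-constant X.
# X ~ Exponential(1) is an illustrative assumption, not forced by the argument.
random.seed(0)
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]

mean_x = sum(xs) / n                    # E[X]   = 1 for Exp(1)
mean_x2 = sum(x * x for x in xs) / n    # E[X^2] = 2
mean_x3 = sum(x ** 3 for x in xs) / n   # E[X^3] = 6

# Cov(X, X^2) = E[X^3] - E[X] * E[X^2]; for Exp(1) this is 6 - 1*2 = 4 > 0.
cov = mean_x3 - mean_x * mean_x2
```

A positive estimate of `cov` is exactly what the answer's factorization $(X-EX)^2(X+EX)$ predicts.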

(Assuming all required expectations exist.) Geometrically, the random pair ($X$, $X^2$) lives on the right half of the curve $y=x^2$, so the correlation is positive if $X$ is not constant.

Answer (2 votes):

Surely $X$ and $X^2$ are not independent! Knowing $X$ determines $X^2$ exactly, and knowing $X^2$ pins $X$ down to at most the two values $\pm\sqrt{X^2}$.

Answer (0 votes):

Write $\pm A = A\cup(-A)$ and $A^2=\{a^2 : a\in A\}$ for a set $A$ in the Borel σ-algebra of $\mathbb R$. If $X^2\in A^2$, then $X\in \pm A$. Take $A$ such that $0<P(X\in \pm A)<1$ (such an $A$ exists by your assumption that $f(x)>0$ for all $x\in \mathbb R$); then $$P(X\in \pm A\mid X^2\in A^2)=1\neq P(X\in \pm A),$$ hence $X$ and $X^2$ are always dependent under this assumption. Without it (see the comment in the other answer), you can construct a degenerate random variable for which $X$ and $X^2$ are independent.
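This conditioning argument can be made concrete by simulation. A sketch under assumed choices: $X \sim N(0,1)$ and $A=(1,\infty)$, so "$X\in\pm A$" means $|X|>1$ and "$X^2\in A^2$" means $X^2>1$:

```python
import random

# Assumed concrete instance of the argument: X ~ N(0,1), A = (1, inf).
# Then {X in ±A} = {|X| > 1} and {X^2 in A^2} = {X^2 > 1}, which coincide.
random.seed(0)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

p_pm_a = sum(1 for x in xs if abs(x) > 1.0) / n  # P(X in ±A), strictly between 0 and 1

conditioned = [x for x in xs if x * x > 1.0]     # sample restricted to {X^2 in A^2}
p_cond = sum(1 for x in conditioned if abs(x) > 1.0) / len(conditioned)

# p_cond is exactly 1 (the two events coincide), while p_pm_a is about 0.317,
# so P(X in ±A | X^2 in A^2) != P(X in ±A): X and X^2 are dependent.
```

Independence would require the conditional and unconditional probabilities to match; here they differ as drastically as possible.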