If $X$ is a random variable and I have another r.v. $Y = f(X)$, is $Y$ always dependent on $X$?


If $X$ is a random variable and I have another r.v. $Y = f(X)$, is $Y$ always dependent on $X$?

I don't think so, because of this question: How to show that $Y_1, Y_2$ are independent

There, $Y_2 = N - Y_1$ (which looks like the value of $Y_2$ must change depending on $Y_1$), yet it is provable that $Y_1$ and $Y_2$ are independent.

If we have a random variable $X$, is something like $Y = 5X$ never independent of $X$?

There are 3 best solutions below


Well, if $X$ is constant, then $X$ is independent of $Y$ for all $Y$, including $Y=f(X)$, which will simply be some other constant.

If $f$ and $X$ are such that there is a number $z$ with $f(X)=z$ with probability one, then $Y=z$ with probability one and is thus independent of $X$.

Otherwise there is a $w$ such that $P(Y>w)>0$ and $P(Y\le w)>0$. Let $A=\{x: f(x)\le w\}$ and $B=\{x: f(x)>w\}$; these are disjoint, with $X\in A$ exactly when $Y\le w$ and $X\in B$ exactly when $Y>w$. Then $P(X\in A,\, Y>w)=0$ while $P(X\in A)\,P(Y>w)>0$, so $X$ and $Y$ are not independent.

To sum up, $X$ and $Y=f(X)$ are independent in this case if and only if $Y$ is equal to a constant with probability one (or "almost surely," as they say in probability theory).

By the way, in your example with $Y_{2}=N-Y_{1}$, assuming $N$ to be a constant, the assertion is false unless $Y_{1}$ is almost surely constant.
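The dichotomy above can be checked exactly on a small example. This is my own sketch, not part of the answer: take $X$ a fair six-sided die (my choice), and compare $Y = 5X$ (not almost surely constant) against the constant $Y = 7$, testing the factorisation $P(X=x,\,Y=y)=P(X=x)\,P(Y=y)$ over all pairs.

```python
from itertools import product
from fractions import Fraction

# X is a fair die on {1,...,6}; each outcome has exact probability 1/6.
outcomes = range(1, 7)
p = Fraction(1, 6)

def independent(f):
    """Check P(X = x, f(X) = y) == P(X = x) * P(f(X) = y) for all x, y."""
    ys = {f(x) for x in outcomes}
    for x, y in product(outcomes, ys):
        joint = p if f(x) == y else Fraction(0)       # P(X = x, Y = y)
        marg_y = sum(p for o in outcomes if f(o) == y)  # P(Y = y)
        if joint != p * marg_y:
            return False
    return True

print(independent(lambda x: 5 * x))   # False: Y = 5X determines X
print(independent(lambda x: 7))       # True: a constant Y is independent of X
```

Using `Fraction` keeps every probability exact, so the factorisation test is an equality check rather than a floating-point comparison.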


The answer is no. I'm not sure at what level your question is aimed, so I'll give you a counterexample; in most "normal" cases, though, $X$ and $Y$ will be dependent.

Here is the definition of two random variables being independent: $X$ and $Y$ are independent if and only if events $A$ and $B$ are independent, that is $\mathbb P(A \cap B) = \mathbb P(A) \cdot \mathbb P(B)$, whenever $A \in \sigma(X)$ and $B \in \sigma(Y)$, where $\sigma(X)=\{X^{-1}(A) \mid A \in \mathcal B(\mathbb R)\}$.

Now let $X$ be a r.v. and let $f$ be a real-valued function such that both $f$ and $f^{-1}$ are Borel measurable. Let $Y = f \circ X$ be such that $\mathbb P(Y \le y) \in \{0, 1\}$ for every $y$. Then $X$ and $Y$ are independent.

$$\sigma(X)=\{X^{-1}(A)|A \in \mathcal B(\mathbb R)\}=\{X^{-1}\circ f^{-1}\circ f(A)|A \in \mathcal B(\mathbb R)\}$$

$$\sigma(Y)=\{(f\circ X)^{-1}(A)|A \in \mathcal B(\mathbb R)\}=\{X^{-1} \circ f^{-1}(A)|A \in \mathcal B(\mathbb R)\}$$

We have $\sigma(X)=\sigma(Y)$: since $f$ is Borel measurable, $\sigma(Y)\subset \sigma(X)$, and since $f^{-1}$ is Borel measurable, $\sigma(X) \subset \sigma(Y)$.

Now take a look at $\mathbb P (\{\omega: X(\omega) \le x \} \cap \{\omega: Y(\omega) \le y\})$: no matter which values of $x, y$ we take, it equals $\mathbb P(\{\omega: X(\omega) \le x\}) \cdot\mathbb P (\{\omega: Y(\omega) \le y\})$.

This is because:

If we take $y$ s.t. $\mathbb P(Y \le y)=0$, then $\{X \le x\} \cap \{Y \le y\} \subset \{Y \le y\}$, and thus $$0 \le \mathbb P (\{\omega: X(\omega) \le x \} \cap \{\omega: Y(\omega) \le y\}) \le \mathbb P (Y \le y) = 0$$

If we take $y$ s.t. $\mathbb P (Y \le y) = 1$, i.e. $\mathbb P( Y > y) = 0$, then $\{X \le x\} \cap \{Y \le y\} = \{ X \le x \} \setminus \{Y > y\}$, thus $$\mathbb P(X \le x)=\mathbb P(X \le x) - \mathbb P(Y > y) \le \mathbb P (\{\omega: X(\omega) \le x \} \cap \{\omega: Y(\omega) \le y\})$$ $$= \mathbb P(\{ X \le x \} \setminus \{Y > y\}) \le \mathbb P (X \le x)$$

Thus we have:

$$\mathbb P(X \le x, Y \le y) = \mathbb P(X \le x) \cdot \mathbb P(Y \le y)$$

So $X$ and $Y$ are independent in this example.
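The two cases of the argument can be illustrated numerically. This is my own sketch (the specific $X$ and the constant value are my choices, not the answer's): $X$ uniform on $\{1,\dots,6\}$ and $Y = 3$ almost surely, so that $\mathbb P(Y \le y)$ is $0$ for $y < 3$ and $1$ for $y \ge 3$, and the joint CDF factorises everywhere.

```python
from fractions import Fraction

# X uniform on {1,...,6}; Y == c with probability one (degenerate CDF).
outcomes = range(1, 7)
p = Fraction(1, 6)
c = 3

def P_X_le(x):
    """CDF of X at x."""
    return sum((p for o in outcomes if o <= x), Fraction(0))

def P_Y_le(y):
    """CDF of Y at y: the '0 or 1' condition from the answer."""
    return Fraction(1) if y >= c else Fraction(0)

# Check P(X <= x, Y <= y) == P(X <= x) * P(Y <= y) on a grid.
for x in range(0, 8):
    for y in range(0, 8):
        joint = P_X_le(x) if y >= c else Fraction(0)
        assert joint == P_X_le(x) * P_Y_le(y)

print("joint CDF factorises at every (x, y) checked")
```

The case $y < c$ is the "$\mathbb P(Y \le y) = 0$" branch of the proof, and $y \ge c$ is the "$\mathbb P(Y \le y) = 1$" branch; both sides of the factorisation agree exactly.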


By the way, in a different question (How to show that $Y_1, Y_2$ are independent) you gave an example where $N$ is a random variable and $Y_{1}$ and $Y_{2}$ are independent. In this question you didn't say what $N$ was, so it was taken to be a constant, since $N$ is often used for constants. So if you are wondering why the answers here seem inconsistent with the answers there, it's because the questions aren't as similar as you thought.
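For reference, the classic version of that other problem is Poisson thinning. This is an assumption on my part, since the setup isn't restated here: $N \sim \text{Poisson}(\lambda)$, and each of the $N$ items independently lands in group 1 with probability $q$, giving $Y_1$ and $Y_2 = N - Y_1$. In that case $Y_1 \sim \text{Poisson}(\lambda q)$, $Y_2 \sim \text{Poisson}(\lambda(1-q))$, and they are independent even though $Y_2 = N - Y_1$. A sketch verifying the joint pmf factorises:

```python
import math

# Assumed setup (Poisson thinning): N ~ Poisson(lam), Y1 | N ~ Binomial(N, q),
# Y2 = N - Y1. The parameter values below are arbitrary illustrations.
lam, q = 2.5, 0.4

def poisson_pmf(mu, k):
    return math.exp(-mu) * mu**k / math.factorial(k)

def joint(j, k):
    # P(Y1 = j, Y2 = k) = P(N = j + k) * C(j + k, j) * q^j * (1 - q)^k
    n = j + k
    return poisson_pmf(lam, n) * math.comb(n, j) * q**j * (1 - q)**k

# The joint pmf equals the product of two Poisson marginals on the grid.
for j in range(6):
    for k in range(6):
        prod = poisson_pmf(lam * q, j) * poisson_pmf(lam * (1 - q), k)
        assert math.isclose(joint(j, k), prod)

print("P(Y1 = j, Y2 = k) == P(Y1 = j) * P(Y2 = k) on the checked grid")
```

Algebraically, $e^{-\lambda}\frac{\lambda^{j+k}}{(j+k)!}\binom{j+k}{j}q^j(1-q)^k = \left(e^{-\lambda q}\frac{(\lambda q)^j}{j!}\right)\left(e^{-\lambda(1-q)}\frac{(\lambda(1-q))^k}{k!}\right)$, which is why this works only for Poisson $N$; with a constant $N$, as the answers above note, dependence is unavoidable unless $Y_1$ is almost surely constant.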