Independence of random variables where probability = 1

Asked 2026-03-31 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

If we have random variables $X$ and $Y$ and we know that $P(X=5)=1$, can we immediately conclude that $X$ and $Y$ are independent? Intuitively it seems this information is enough to say that we don't need $Y$ to know $X$, but couldn't $Y$ be a function of $X$? Then the joint PMF would satisfy $P(X,Y) = P(X)P(Y)$ but also $P(X,Y) = P(Y)$. How can we justify independence in such cases? Mathematically it works out, but it seems we're claiming that a random variable is independent of itself.
Constant, or almost surely constant, random variables are independent of every other random variable (including themselves!). This follows from the fact that an event with probability 0 or 1 is independent of every event (including itself!). Let us see why.
Independence of events $A$ and $B$ is defined as $P(A \cap B) = P(A)P(B)$.
Now, independence of random variables $X$ and $Y$ can be defined in several equivalent ways (in elementary probability theory). One is that the joint pdf factors (assuming $X$ and $Y$ have pdfs and a joint pdf); another is that the joint cdf factors:
$$P(X \le a)P(Y \le b) = P(X \le a, Y \le b) \tag{*}$$
for all $a,b \in \mathbb R$.
Define
$$A = \{X \le a\}$$ $$B = \{Y \le b\}$$
Then we have a familiar form:
$$P(A)P(B)=P(A \cap B) \tag{**}$$
It remains to check $(**)$ when $P(A) \in \{0,1\}$. If $P(A) = 1$, then $P(B \setminus A) \le P(A^c) = 0$, so $P(A \cap B) = P(B) - P(B \setminus A) = P(B) = P(A)P(B)$. If $P(A) = 0$, then $P(A \cap B) \le P(A) = 0 = P(A)P(B)$. Since $P(X = 5) = 1$ forces $P(X \le a) \in \{0,1\}$ for every $a$, the event $A$ above always has probability 0 or 1, and $(*)$ holds for all $a, b$.
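As a sanity check, here is a small simulation sketch of the factorization $(*)$ when $X$ is almost surely constant. The choice of $Y$ (uniform on $[0,10]$) is an arbitrary assumption for illustration; any $Y$ would do. Because $P(X \le a)$ is exactly 0 or 1, the two sides agree exactly, not just approximately.

```python
import random

# X = 5 with probability 1; Y is an arbitrary random variable
# (here: uniform on [0, 10], an assumption for illustration).
random.seed(0)
n = 100_000
samples = [(5.0, random.uniform(0, 10)) for _ in range(n)]

def cdf_joint(a, b):
    """Empirical P(X <= a, Y <= b)."""
    return sum(1 for x, y in samples if x <= a and y <= b) / n

def cdf_x(a):
    """Empirical P(X <= a) -- exactly 0 or 1, since X is constant."""
    return sum(1 for x, _ in samples if x <= a) / n

def cdf_y(b):
    """Empirical P(Y <= b)."""
    return sum(1 for _, y in samples if y <= b) / n

# The factorization (*) holds exactly for every choice of a and b:
for a, b in [(4, 3), (5, 3), (6, 7)]:
    assert abs(cdf_x(a) * cdf_y(b) - cdf_joint(a, b)) < 1e-12
```

Note that when $a < 5$ both sides are 0, and when $a \ge 5$ both sides equal the empirical $P(Y \le b)$, which is exactly the case analysis above.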
As for your other concern about $Y$ being a function of $X$:
If $X$ is (almost surely) constant then so is $Y$.
For example, if $P(X=5)=1$ and $Y=5X^2+\sin(X)$, then $P(Y = 125+\sin(5)) = 1$.
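The closing example can be checked numerically; since $X$ only ever takes the value 5, $Y$ only ever takes the single value $5 \cdot 25 + \sin(5)$:

```python
import math

# If P(X = 5) = 1 and Y = 5*X**2 + sin(X), then Y is also
# almost surely constant, equal to 125 + sin(5).
x = 5.0                      # the only value X ever takes
y = 5 * x**2 + math.sin(x)   # the only value Y ever takes
print(y)                     # 125 + sin(5), roughly 124.04
```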