$X$, $Y$ independent $\Rightarrow$ $f(X)$, $f(Y)$ independent


I am a statistics student trying to see whether, if $X$ and $Y$ are independent, then $\log X$ and $\log Y$ are independent as well.

I tried to look for information on this, and I see people mentioning Borel functions all the time.

I am not familiar with that definition, and given the courses I will be taking in the future, it is unlikely that I will ever study the set theory or topology that would let me fully learn it.

So, it boils down to this question.

Can any of the following elementary functions be an example where

"$X$ independent from $Y$ but $f(X)$ is dependent on $f(Y)$?"

$$f(x)=x^n, \qquad f(x) = a^x, \qquad f(x)=\log_a(x), \qquad f(x)=x^{1/n},$$ or trig and inverse trig functions such as $\sin(x)$, $\cos(x)$, $\arcsin(x)$, $\arccos(x)$?

Intuitively, I want to say that when $X$ and $Y$ are independent, invertible functions would preserve independence in $f(X)$ and $f(Y)$, but I would like to know if that is the case, or at least whether it is true for these elementary functions that are taught in high school.

Accepted answer:

A more elementary definition of independence for real random variables:

$X$ and $Y$ are independent means $$P(X\le x, Y\le y)=P(X\le x)P(Y\le y)\tag1$$ for every $x$ and $y$.

So to show that $\log X$ and $\log Y$ are independent (assuming $X$ and $Y$ are positive random variables), it's enough to verify $$P(\log X\le x, \log Y\le y) = P(\log X\le x)P(\log Y\le y)\tag2$$ for every $x$ and $y$. To do so, write the LHS of (2) as $$P(X\le e^x, Y\le e^y)$$ which by (1) equals $$P(X\le e^x)P(Y\le e^y)$$ which in turn equals $$P(\log X\le x)P(\log Y\le y).$$ But this is the RHS of (2), and we're done! The same argument proves that $h(X)$ and $h(Y)$ are independent whenever $h$ is monotonic (for a decreasing $h$, rewrite the events in terms of $P(X\ge\cdot)$ instead). If you're familiar with the concept of an inverse image $h^{-1}(A)$ of a set $A$, you can extend this result to arbitrary Borel measurable functions $h$.
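The factorization in (2) can also be checked numerically. A minimal sketch, assuming independent log-normal draws for $X$ and $Y$ (any independent positive random variables would do), compares the empirical joint CDF of $(\log X, \log Y)$ with the product of the marginal CDFs at a few thresholds:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent positive random variables (log-normal, chosen for illustration).
X = rng.lognormal(size=n)
Y = rng.lognormal(size=n)

logX, logY = np.log(X), np.log(Y)

# Compare P(log X <= x, log Y <= y) with P(log X <= x) * P(log Y <= y).
for x, y in [(-0.5, 0.3), (0.0, 0.0), (1.0, -1.0)]:
    joint = np.mean((logX <= x) & (logY <= y))
    product = np.mean(logX <= x) * np.mean(logY <= y)
    print(f"x={x:+.1f}, y={y:+.1f}: joint={joint:.4f}, product={product:.4f}")
```

With a large sample the two columns agree up to sampling noise, which is exactly what (2) predicts; of course, a simulation is only a sanity check, not a proof.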

Another answer:

If $X$ and $Y$ are independent of each other, then knowing the value of one of them gives you no information about the distribution of the other.

As a result, it doesn't matter what (Borel-measurable) function you apply: the distributions are still unrelated to each other, and you can't gain information about one from the value of the other.