Given a random variable $X$, is $g(X)$ necessarily correlated with $X$ for nonconstant continuous $g$?


I believe the answer to be no, as per my example below, but this seems counterintuitive.

If my counterexample is incorrect, please point out where I went wrong and/or offer a different counterexample or proof, depending on the true answer to my question.

If my counterexample is correct, what is wrong with the following logic? "Given $X(\omega)$, I can determine $Y(\omega) = g(X(\omega))$, so they must be correlated." Note that requiring $g$ to be nonconstant matters here, since a constant $g$ trivially gives zero covariance. We could also get into the invertibility of $g$: for example, when $g(x) = x^2$, both $X = \alpha$ and $X = -\alpha$ map to the same value $Y = g(X) = \alpha^2$. Even in this case, though, it would seem to me that $Y = X^2$ is correlated with $X$, but I may be wrong.

Now I provide my proposed counterexample. The central idea is to take even and odd powers of a Gaussian random variable, which appear to have zero covariance.

Let $X \sim N(0,2)$, i.e., a centered Gaussian with variance $2$, so its moment-generating function is \begin{align*} M(t) = e^{t^2} = \sum_{n=0}^{\infty} \frac{t^{2n}}{n!}. \end{align*} Thus differentiating $k$ times gives \begin{align*} M^{(k)}(t) = \sum_{n=0}^{\infty} \frac{(2n)!}{(2n-k)! n!} t^{2n-k} \theta_{2n,k} \end{align*} where \begin{align*} \theta_{i,j} = \begin{cases} 0 & i < j \\ 1 & i \geq j \end{cases}. \end{align*} From these equations and the definition of a moment-generating function, we see that

\begin{equation} \color{red}{\mathbb{E}[X^{k}] = M^{(k)}(0) = \begin{cases} \frac{k!}{\left( \frac{k}{2} \right)!} & k \text{ even } \\ 0 & k \text{ odd } \end{cases}} \end{equation} Now simply define $Y = X^{2m}$ and $Z = X^{2n + 1}$ for some $m \in \mathbb{N}$ and $n \in \mathbb{N} \cup \{0\}$ (taking $m \geq 1$ ensures $Y$ is nonconstant, so its correlation with $Z$ is defined) and compute the covariance: \begin{align*} Cov(Y,Z) &= \mathbb{E}[X^{2(m+n)+1}] - \mathbb{E}[X^{2m}]\mathbb{E}[X^{2n+1}] \\ &= 0 - \mathbb{E}[X^{2m}]\cdot 0 \\ &= 0 \end{align*} where the zeros come from the red equation above. Since the covariance is zero, the correlation is also zero.

Best Answer

Your example is correct: $X$ and $g(X)$ need not be correlated. I think your confusion comes from the everyday interpretation of "correlated" as simply meaning dependent in some sense, i.e., as you say, if you know one, you know something about the other. In statistics, however, these are two very different things. Clearly, $X$ and $g(X)$ are not independent, since knowing $X$ automatically determines $g(X)$. In its strict definition, though, correlation measures the degree to which two variables are linearly related. In this sense, it is not really surprising that, say, $X$ and $X^2$ are uncorrelated when $X \sim N(0,1)$: there is no linear component to the relation between $x$ and $x^2$.