When does multiplication with a scalar rv preserve independence?


Let $X_1,\ldots,X_n$ be independent random variables on $\mathbb{R}$. Let $f:\mathbb{R}^n \rightarrow \mathbb{R}$ be a (deterministic) function. Under which conditions is the new set of random variables \begin{align*} f(X_1,\ldots,X_n)X_1,\ldots,f(X_1,\ldots,X_n)X_n \end{align*} independent?


A negative case is:

  • if $f(x_1,\ldots,x_n)=\frac{1}{\sum\limits_{i=1}^n x_i}$, then the new random vector is not independent because its components must sum to one. The same argument works for all kinds of normalizations. It might still be that any subset of $n-1$ of the new random variables is independent, though.
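As a quick numerical illustration of this negative case (a sketch; the choice of exponential inputs is mine, made so the sum is almost surely positive), the products $f(X)X_i$ always sum to one, so the last coordinate is a deterministic function of the others:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
# Positive inputs so the denominator sum is never zero
X = rng.exponential(size=(100_000, n))

f = 1.0 / X.sum(axis=1)      # f(X_1, ..., X_n) = 1 / (X_1 + ... + X_n)
Y = f[:, None] * X           # Y_i = f(X) * X_i

# Every row of Y sums to one, hence Y_n = 1 - Y_1 - ... - Y_{n-1}
assert np.allclose(Y.sum(axis=1), 1.0)
```

Since $Y_n$ is determined by $Y_1,\ldots,Y_{n-1}$, the full vector cannot be jointly independent (except in degenerate cases).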

I'm particularly interested in "false normalizations" for continuous random variables:

  • $f(x)=\frac{1}{\sum\limits_{i=1}^n x_i^2}$

This is a false normalization because the (Euclidean) norm of the new vector is not deterministic, so I cannot simply express one of the new random variables as a function of the others. This is in contrast to the "true normalization" $f(x)=\frac{1}{\sqrt{\sum\limits_{i=1}^n x_i^2}}$, where the (Euclidean) norm of the new vector is identically one, so the new vector cannot be jointly independent. Any insights here would be highly appreciated.
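The contrast between the two normalizations can be seen numerically (a sketch using standard normal inputs, which is my choice, not part of the question): under the true normalization the norm of the new vector is exactly one, while under the false normalization it equals $1/\lVert X\rVert$ and genuinely varies, so the norm argument for dependence does not apply.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 100_000
X = rng.standard_normal((m, n))
sq = (X**2).sum(axis=1)              # sum of squares, ||X||^2

Y_false = X / sq[:, None]            # f(x) = 1 / sum x_i^2
Y_true = X / np.sqrt(sq)[:, None]    # f(x) = 1 / sqrt(sum x_i^2)

norm_true = np.linalg.norm(Y_true, axis=1)    # identically 1
norm_false = np.linalg.norm(Y_false, axis=1)  # equals 1/||X||, random

assert np.allclose(norm_true, 1.0)
assert norm_false.std() > 0.1        # the norm is far from deterministic
```

Of course, this only shows that the deterministic-norm argument fails for the false normalization; it does not settle whether the resulting vector is independent.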