Minimum of functions of random variables: probability mass function


I have independent and identically distributed random variables $X_1, \ldots, X_n$ with probability density function $p_X$. I want to compute the probability density function of the random variable $Z = \min\{f(X_1), \ldots, f(X_n)\}$. Let $Y_i = f(X_i)$; if I assume that $f$ is monotone, I can compute $p_Y$ as:

\begin{equation} p_Y(y) = p_X(f^{-1}(y))\left|\frac{d}{dy}(f^{-1}(y))\right|. \end{equation}
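For example (a worked case with distribution and map of my own choosing), take $X \sim \operatorname{Exp}(\lambda)$, so $p_X(x) = \lambda e^{-\lambda x}$ for $x \ge 0$, and the monotone map $f(x) = e^x$ with $f^{-1}(y) = \ln y$. Then

\begin{equation} p_Y(y) = p_X(\ln y)\left|\frac{d}{dy}\ln y\right| = \lambda e^{-\lambda \ln y}\,\frac{1}{y} = \lambda y^{-\lambda-1}, \qquad y \ge 1, \end{equation}

i.e. $Y$ is Pareto distributed with tail $1 - F_Y(y) = y^{-\lambda}$.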

I can then compute the cumulative distribution function $F_Z$ as:

\begin{align} F_Z(z) = P(Z\leq z) &= P(\min\{Y_1,\ldots,Y_n\}\leq z) = 1-P((Y_1>z)\wedge\ldots\wedge (Y_n>z)) \\ &= 1-P(Y_1>z)\cdots P(Y_n>z) = 1-(1-F_Y(z))^n, \end{align}

where the factorization in the second line uses the independence of the $Y_i$ (inherited from the independence of the $X_i$).

Finally, for the probability density function $p_Z$, I get:

\begin{equation} p_Z(z) = \frac{d}{dz}F_Z(z) = n(1-F_Y(z))^{n-1}p_Y(z) = n(1-F_Y(z))^{n-1}p_X(f^{-1}(z))\left|\frac{d}{dz}f^{-1}(z)\right|. \end{equation}
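As a sanity check, the closed form can be compared against simulation. In the exponential/Pareto example above, $1 - F_Y(z) = z^{-\lambda}$, so $p_Z(z) = n\lambda z^{-n\lambda-1}$ and $F_Z(z) = 1 - z^{-n\lambda}$, i.e. $Z$ is again Pareto. A minimal Monte Carlo sketch in Python (the parameter values are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo sanity check of F_Z(z) = 1 - (1 - F_Y(z))**n, using the
# example X ~ Exp(lam), f(x) = exp(x), so Y is Pareto with
# 1 - F_Y(y) = y**(-lam) and Z = min(Y_1, ..., Y_n) has exponent n * lam.
lam, n, trials = 2.0, 5, 500_000   # arbitrary illustrative choices
rng = np.random.default_rng(0)

X = rng.exponential(scale=1.0 / lam, size=(trials, n))
Z = np.exp(X).min(axis=1)          # Z = min of the Y_i = f(X_i)

for z in (1.1, 1.5, 2.0):
    empirical = np.mean(Z <= z)            # estimate of F_Z(z)
    analytic = 1.0 - z ** (-n * lam)       # closed form derived above
    print(f"z={z}: empirical {empirical:.4f} vs analytic {analytic:.4f}")
```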

First of all, is the logic above correct? Can I extend the above to a probability mass function? It seems to me that in that case the identity would be as simple as $p_Y(y) = p_X(f^{-1}(y))$, and I believe the rest remains the same. What if I allow $f$ to be non-monotone and potentially non-invertible?

BEST ANSWER

Yes, the logic is correct.

If $X$ is discrete (i.e., it has a probability mass function) and $Y = f(X)$, then the PMF of $Y$ is $$P(Y=y) = \sum_{x \in f^{-1}(\{y\})} P(X=x),$$ where $f^{-1}(\{y\})$ is the set of points that $f$ maps to $y$ (if there are no such points, the probability is $0$). This holds even when $f$ is non-monotone or non-invertible. When $f$ is invertible, $f^{-1}(\{y\})$ is a single point and the sum reduces to your definition of $p_Y$. See Casella and Berger, chapter 3.1, for more details.
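A minimal sketch of that preimage sum in Python (the uniform distribution and the non-invertible $f(x) = x^2$ are illustrative choices, not from the question):

```python
from collections import defaultdict

# PMF of Y = f(X) for discrete X: sum P(X = x) over the preimage f^{-1}({y}).
# Example: X uniform on {-2, -1, 0, 1, 2} and f(x) = x**2, which is
# non-invertible since f(-x) = f(x).
p_X = {x: 0.2 for x in (-2, -1, 0, 1, 2)}
f = lambda x: x * x

p_Y = defaultdict(float)
for x, px in p_X.items():
    p_Y[f(x)] += px   # accumulate mass over all x with f(x) = y

print(dict(p_Y))      # {4: 0.4, 1: 0.4, 0: 0.2}
```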