Composition of probability distribution functions


Suppose we are given $X \sim \mathcal{N}(\mu,\Sigma)$. Then, we define the random variable $Y$ as follows:

$Y_i = 1 + X_i $ if $X_i \ge 0$

$Y_i = \exp(X_i)$ if $X_i \lt 0$.

How do I go about calculating the probability density of $Y$? And $E[Y_i]$ for all $i$? (This is not a homework problem — it is taken from a paper where $Y$ models the prior distribution of surface emissions of methane.)
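To get a feel for the problem before deriving anything, here is a Monte Carlo sketch (assuming NumPy; the mean vector and covariance are made up for illustration, not taken from the paper) that samples $X$ and applies the piecewise map to estimate $E[Y_i]$:

```python
import numpy as np

def to_y(x):
    # Piecewise map from the question: Y_i = 1 + X_i if X_i >= 0, else exp(X_i).
    return np.where(x >= 0.0, 1.0 + x, np.exp(x))

rng = np.random.default_rng(0)

# Illustrative parameters: a 2-dimensional Gaussian.
mu = np.array([0.0, 1.0])
cov = np.array([[1.0, 0.0],
                [0.0, 4.0]])

x = rng.multivariate_normal(mu, cov, size=500_000)
y = to_y(x)

# Monte Carlo estimates of E[Y_i]:
print(y.mean(axis=0))
```

Since each $Y_i$ is bounded below by $0$ and the map is piecewise monotone, the sample-based estimates converge quickly; they give something to check a closed-form answer against.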

Is the derivation in @martini's reply correct if the $X$ are correlated ($\Sigma$ is not diagonal)?
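One way to check the correlation question empirically: since each $Y_i$ depends on $X_i$ alone, its marginal law is determined by $\mu_i$ and $\Sigma_{ii}$ only. A quick simulation sketch (assuming NumPy; the covariance values are made up for illustration) comparing a diagonal and a correlated $\Sigma$ with the same marginals:

```python
import numpy as np

def to_y(x):
    # Y_i = 1 + X_i if X_i >= 0, else exp(X_i)
    return np.where(x >= 0.0, 1.0 + x, np.exp(x))

rng = np.random.default_rng(1)
mu = np.zeros(2)

# Same marginals (unit variances), without and with correlation.
cov_diag = np.eye(2)
cov_corr = np.array([[1.0, 0.8],
                     [0.8, 1.0]])

y_diag = to_y(rng.multivariate_normal(mu, cov_diag, size=500_000))
y_corr = to_y(rng.multivariate_normal(mu, cov_corr, size=500_000))

# The marginal means of Y agree up to Monte Carlo noise.
print(y_diag.mean(axis=0))
print(y_corr.mean(axis=0))
```

The joint density of $Y$ would of course change with the off-diagonal entries, but the marginal densities and means do not.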

Best answer:

For the density: note that $X_i \sim N(\mu_i, \Sigma_{ii})$. This marginal law depends only on $\mu_i$ and $\Sigma_{ii}$, so the computation below is valid whether or not $\Sigma$ is diagonal. For $t \in \mathbf R^+$, $t \le 1$, we have
\begin{align*}
\def\P{\mathbf P}\P(Y_i \le t) &= \P(e^{X_i} \le t)\\
&= \P(X_i \le \log t)\\
&= \frac 1{\sqrt{2\pi\Sigma_{ii}}} \int_{-\infty}^{\log t} \exp\bigl(-(x-\mu_i)^2/(2\Sigma_{ii})\bigr)\, dx.
\end{align*}
For $t \ge 1$, we have
\begin{align*}
\P(Y_i \le t) &= \P(Y_i \le 1) + \P(1 < Y_i \le t)\\
&= \P(Y_i \le 1) + \P(0 \le X_i \le t-1)\\
&= \P(X_i \le 0) + \P(0 \le X_i \le t-1)\\
&= \P(X_i \le t-1)\\
&= \frac 1{\sqrt{2\pi\Sigma_{ii}}} \int_{-\infty}^{t-1} \exp\bigl(-(x-\mu_i)^2/(2\Sigma_{ii})\bigr)\, dx.
\end{align*}
Differentiating with respect to $t$, we find that $Y_i$'s density is given by
$$ f_i(t) = \begin{cases} 0 & t \le 0,\\ \frac 1t \cdot (2\pi\Sigma_{ii})^{-1/2} \exp\bigl(-(\log t - \mu_i)^2/(2\Sigma_{ii})\bigr) & 0 < t \le 1,\\ (2\pi \Sigma_{ii})^{-1/2}\exp\bigl(-(t-1 - \mu_i)^2/(2\Sigma_{ii})\bigr) & t > 1. \end{cases} $$
For the expectation, compute $E[Y_i] = \int_{\mathbf R} tf_i(t)\, dt$.
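As a sanity check of the piecewise density, one can verify numerically that it integrates to $1$ and compute $E[Y_i] = \int_{\mathbf R} t f_i(t)\, dt$ by quadrature, splitting the integral at $t = 1$ where the two branches meet. A sketch assuming SciPy, with $\mu_i = 0$ and $\Sigma_{ii} = 1$ chosen purely for illustration:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, var = 0.0, 1.0  # illustrative values for mu_i and Sigma_ii
sd = np.sqrt(var)

def f(t):
    # Density of Y_i derived above: a lognormal-type branch on (0, 1]
    # and a shifted-normal branch on (1, inf).
    if t <= 0.0:
        return 0.0
    if t <= 1.0:
        return norm.pdf(np.log(t), mu, sd) / t
    return norm.pdf(t - 1.0, mu, sd)

# Total mass: should come out very close to 1.
mass = quad(f, 0.0, 1.0)[0] + quad(f, 1.0, np.inf)[0]

# Expectation E[Y_i] = integral of t * f(t), again split at t = 1.
mean = quad(lambda t: t * f(t), 0.0, 1.0)[0] + quad(lambda t: t * f(t), 1.0, np.inf)[0]

print(mass, mean)
```

The two pieces of the expectation correspond to $E[e^{X_i}\,\mathbf 1\{X_i < 0\}]$ and $E[(1 + X_i)\,\mathbf 1\{X_i \ge 0\}]$, which is how one would also attack the integral analytically.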