Continuous random variables conditioned on discrete random variables

Suppose I have two continuous random variables $X_1 \sim N(0,\sigma_1^2)$ and $X_2 \sim N(0,\sigma_2^2)$.

Then I have a discrete random variable $\theta$, and $Z$ defined in terms of it: $$ \theta = \begin{cases} \theta_1 & \text{with probability } p\\ \theta_2 & \text{with probability } 1-p \end{cases} $$ $$ Z = \begin{cases} X_1, & \theta=\theta_1\\ X_2, & \theta =\theta_2 \end{cases} $$
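For concreteness, this two-stage construction can be sampled directly: draw $\theta$ first, then draw $Z$ from the selected normal. Below is a minimal Python sketch; the numeric values of $p$, $\sigma_1$, and $\sigma_2$ are illustrative assumptions, not part of the problem.

```python
import random
import statistics

random.seed(0)

# Illustrative parameters (assumptions, not from the problem statement)
p = 0.3
sigma1, sigma2 = 1.0, 2.0

def sample_z():
    """Draw theta first, then draw Z from the selected component."""
    if random.random() < p:                 # theta = theta_1 with probability p
        return random.gauss(0.0, sigma1)    # Z = X_1 ~ N(0, sigma1^2)
    else:                                   # theta = theta_2 with probability 1 - p
        return random.gauss(0.0, sigma2)    # Z = X_2 ~ N(0, sigma2^2)

samples = [sample_z() for _ in range(100_000)]
m = statistics.fmean(samples)       # should be near 0
v = statistics.pvariance(samples)   # should be near p*sigma1**2 + (1-p)*sigma2**2
```

Note that each draw of $Z$ comes from exactly one of the two normals; the randomness of $\theta$ is what mixes them.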

I would like to compute $P(Z|\theta)$, $P(Z,\theta)$, and $P(Z)$.

Here's my attempt; I am wondering whether I can tackle this problem similarly to the discrete case.

I start with a conditional probability matrix \begin{array}{l|ll} P(Z|\theta) &X_1 &X_2 \\ \hline \theta_1 &f(x_1) &0 \\ \theta_2 &0 &f(x_2) \end{array}

Here, the rows of the conditional probability matrix integrate/sum to 1. Would this be a valid probability matrix?

So $P(Z|\theta = \theta_1) = [f(x_1), \ 0]$.

Then $P(Z\in X_1|\theta=\theta_1) = 1$ since $f(x_1)$ integrates to 1.

So $P(Z=x_1|\theta=\theta_1) = f(x_1)$

Then, $$ P(Z |\theta) = \begin{cases} f(x_1),& Z=x_1\\ f(x_2),& Z=x_2 \end{cases} $$

Now for computing $P(Z,\theta) = P(Z|\theta)P(\theta)$, I have a joint probability matrix, \begin{array}{l|ll} P(Z,\theta) &X_1 &X_2 \\ \hline \theta_1 &pf(x_1) &0 \\ \theta_2 &0 &(1-p)f(x_2) \end{array}

which is

$$ P(Z ,\theta) = \begin{cases} pf(x_1),& \theta=\theta_1\\ (1-p)f(x_2),& \theta=\theta_2 \end{cases} $$

Now using this logic, to find $P(Z)$ we have $P(Z) = \sum_\theta P(Z,\theta)$, then,

$$ P(Z) = [pf(x_1), \ (1-p)f(x_2)] $$

So then

$$ P(Z) = \begin{cases} pf(x_1),& Z=x_1\\ (1-p)f(x_2),& Z=x_2 \end{cases} $$
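This marginal is a two-component normal mixture, which can be sanity-checked numerically: the empirical CDF of simulated $Z$ should match $p\,\Phi(t/\sigma_1) + (1-p)\,\Phi(t/\sigma_2)$. A sketch with illustrative values for $p$, $\sigma_1$, $\sigma_2$ (assumptions for demonstration only):

```python
import math
import random

random.seed(1)
p, sigma1, sigma2 = 0.3, 1.0, 2.0   # illustrative values (assumptions)

def norm_cdf(x, sigma):
    """CDF of N(0, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

def mixture_cdf(t):
    # Integral of p*f(x1) + (1-p)*f(x2) over (-inf, t]
    return p * norm_cdf(t, sigma1) + (1 - p) * norm_cdf(t, sigma2)

n = 100_000
samples = []
for _ in range(n):
    sigma = sigma1 if random.random() < p else sigma2
    samples.append(random.gauss(0.0, sigma))

t = 1.0
empirical = sum(z <= t for z in samples) / n
err = abs(empirical - mixture_cdf(t))   # should be small for large n
```

If the mixture form of $P(Z)$ were wrong, `err` would not shrink as `n` grows.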

I'm not sure if making use of the probability matrix is the right approach, or whether this is correct. Though matrices do seem handy for visualization, I don't know if this would generalize well. Further, would $Z$ be considered a random vector? Apparently, the variance of $Z$ is $\operatorname{Var}(Z) = p\sigma_1^2 + (1-p)\sigma_2^2$, but I'm not sure how to derive that.

Any help, guidance, comments would be appreciated :) Thank you.

Best answer

As defined, $Z$ is continuous, with a probability density function given by the Law of Total Probability.

Now, presuming the Bernoulli random variable $\theta$ is independent of the normal random variables $X_1$ and $X_2$:

$$\begin{align}f_{\small Z\mid\theta}(z\mid\theta)&= f_{\small X_1}(z)\,\mathbf 1_{\theta=\theta_1}+f_{\small X_2}(z)\,\mathbf 1_{\theta=\theta_2}\\[2ex] f_{\small Z,\theta}(z,\theta)&= p\,f_{\small X_1}(z)\,\mathbf 1_{\theta=\theta_1}+(1-p)\,f_{\small X_2}(z)\,\mathbf 1_{\theta=\theta_2}\\[2ex]f_{\small Z}(z) &=f_{\small Z,\theta}(z,\theta_1)+f_{\small Z,\theta}(z,\theta_2) \\[1ex] &= p\,f_{\small X_1}(z)+(1-p)\,f_{\small X_2}(z)\\[1ex]&=\left(\dfrac{p\,\mathrm e^{-z^2/2\sigma_1^2}}{\sigma_{\small 1}\sqrt{2\pi~}}+\dfrac{(1-p)\,\mathrm e^{-z^2/2\sigma_2^2}}{\sigma_{\small 2}\sqrt{2\pi~}}\right)\,\mathbf 1_{z\in\Bbb R}\end{align}$$


NB: The bold faced $\bf 1$ is an indicator function, equalling $1$ when the event occurs and $0$ otherwise.

$$\mathbf 1_E=\begin{cases} 1 &:& E\\0&:& \text{otherwise}\end{cases}$$
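As for the variance asked about in the question: since both components have mean $0$, it follows directly from the mixture density (note the $\sigma^2$ terms, as variances rather than standard deviations must be mixed):

$$\begin{align}
\Bbb E[Z] &= \int_{\Bbb R} z\,f_{\small Z}(z)\,dz = p\,\Bbb E[X_1] + (1-p)\,\Bbb E[X_2] = 0,\\[1ex]
\operatorname{Var}(Z) = \Bbb E[Z^2] &= \int_{\Bbb R} z^2\,f_{\small Z}(z)\,dz = p\,\Bbb E[X_1^2] + (1-p)\,\Bbb E[X_2^2]\\[1ex]
&= p\,\sigma_1^2 + (1-p)\,\sigma_2^2.
\end{align}$$

Equivalently, the law of total variance gives the same result, since $\Bbb E[Z\mid\theta]=0$ for both values of $\theta$.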