Determining the distribution of a variable using moments


Is this proof right? My only concern with my solution is that my "proof" would hold in general, not only when $U \sim U(0, 1)$.

Prompt

Is it possible for $X, Y, Z$ to have the same distribution and satisfy $X = U(Y + Z)$, where $U \sim U(0, 1)$, and $Y, Z$ are independent of $U$ and of each other?

Proof

$$M_X(t) = E(e^{tX}), \qquad M_Y(t) = E(e^{tY}), \qquad M_Z(t) = E(e^{tZ})$$

$Y + Z \sim M_Y(t) + M_Z(t) \implies E(e^{tY}) + E(e^{tZ}) = E(e^{tY} + e^{tZ})$. Now we have $U \sim U(0,1)$, and its moment generating function is $M_U(t) = E(e^{tU})$. Then $$U(Y + Z) \sim M_U(t)\,(M_Y(t) + M_Z(t)) = E(e^{tU})\,E(e^{tY} + e^{tZ}) = E\!\left(e^{tU}(e^{tY} + e^{tZ})\right).$$

However, this is not of the same form as $M_X(t) = E(e^{tX})$, so it is not possible.
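The MGF manipulations above can also be probed numerically: for independent $Y, Z$, one can compare a Monte Carlo estimate of $E(e^{t(Y+Z)})$ against both $M_Y(t) + M_Z(t)$ and $M_Y(t)\,M_Z(t)$. A minimal sketch, using $Y, Z \sim U(0,1)$ as an arbitrary test case (this particular choice is an illustration, not part of the problem):

```python
import math
import random

random.seed(1)
N = 200_000
t = 1.0

# Closed form for the MGF of U(0,1): M(t) = (e^t - 1) / t
m_single = (math.exp(t) - 1.0) / t

# Monte Carlo estimate of E(e^{t(Y+Z)}) for independent Y, Z ~ U(0,1)
est = sum(math.exp(t * (random.random() + random.random()))
          for _ in range(N)) / N

print(f"E(e^(t(Y+Z))) ~ {est:.3f}")
print(f"M_Y(t) * M_Z(t) = {m_single ** 2:.3f}")  # product of the MGFs
print(f"M_Y(t) + M_Z(t) = {2 * m_single:.3f}")   # sum of the MGFs
```

Comparing the three printed values for a $t$ where sum and product differ shows which identity the MGF of a sum actually satisfies.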

Best answer

Let us assume that $U \sim \mathcal{U}(0,1)$, and that $Y$, $Z$ are i.i.d. continuous random variables independent of $U$. The joint probability density function (PDF) of the random vector $(U,Y,Z)$ is therefore the product of the PDFs of its components. We consider the function $g: (u,y,z) \mapsto \left(u(y+z),y,z\right)$, and want to determine the PDF $f_{(X,Y,Z)}$ of the variable $(X,Y,Z) = g(U,Y,Z)$. To do so, one can use the change-of-variables method: we take a bounded measurable function $h$ and compute the expected value $\mathbb{E}\,h(X,Y,Z)$:
$$\begin{aligned} \mathbb{E}(h\circ g)(U,Y,Z) &= \int_{\mathbb{R}^3} (h\circ g)(u,y,z)\, f_U(u) f_Y(y) f_Z(z) \,\mathrm{d}u\,\mathrm{d}y\,\mathrm{d}z \, ,\\ &= \int_{\mathbb{R}^3} h\!\left(x,y,z\right) f_U\!\left(\frac{x}{y+z}\right) f_Y(y) f_Z(z) \frac{1}{|y+z|}\,\mathrm{d}x\,\mathrm{d}y\,\mathrm{d}z \, ,\\ &= \int_{\mathbb{R}^3} h\!\left(x,y,z\right) \underbrace{\mathbf{1}_{0\leq x/(y+z)\leq 1} \frac{f_Y(y) f_Z(z)}{|y+z|}}_{f_{(X,Y,Z)}(x,y,z)}\,\mathrm{d}x\,\mathrm{d}y\,\mathrm{d}z \, . \end{aligned}$$

The condition for $X$, $Y$ and $Z$ to be identically distributed reads $f_X(x) = f_Y(x) = f_Z(x)$ for all $x$, where $f_X$ denotes the marginal PDF of $X$. If $X$, $Y$ and $Z$ are positive, this amounts to
$$ f_X(x) = \int_{\mathbb{R}^2} \mathbf{1}_{0\leq x\leq y+z} \frac{f_X(y) f_X(z)}{y+z}\,\mathrm{d}y\,\mathrm{d}z \, . $$
Maybe somebody can deduce something from it.
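One candidate worth testing in this fixed-point equation is the exponential distribution. A quick Monte Carlo sketch (the exponential choice is an assumption for illustration, not derived above) compares the empirical CDF of $X = U(Y+Z)$, with $Y, Z$ i.i.d. $\mathrm{Exp}(1)$ independent of $U$, against the $\mathrm{Exp}(1)$ CDF $1 - e^{-x}$:

```python
import math
import random

random.seed(2)
N = 200_000

# Candidate: Y, Z i.i.d. Exp(1), independent of U ~ Uniform(0,1);
# each sample draws fresh, independent U, Y, Z and forms X = U(Y+Z).
xs = [random.random() * (random.expovariate(1.0) + random.expovariate(1.0))
      for _ in range(N)]

# Compare the empirical CDF of X with the Exp(1) CDF 1 - e^{-x}
for x in (0.5, 1.0, 2.0):
    emp = sum(v <= x for v in xs) / N
    print(f"x = {x}: empirical CDF {emp:.3f}, Exp(1) CDF {1 - math.exp(-x):.3f}")
```

If the two columns agree at every checkpoint, the exponential distribution is at least numerically consistent with the equation above.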