How to find the variance of a variable defined by a random variable?


I need help with this problem:

If $X$ is a uniform random variable over $(0,1)$, define another variable $Y$ by: $Y =\left\{ \begin{array}{lcc} \\ 1 & , & X\leq0.5 \\ \\ 2 & , & X >0.5 \end{array}\right.$

Find the variance of Y.

Ok, I know that $Var(Y)=E(Y^2)-(E(Y))^2$, so I tried to find the expected value first. So I tried to find it like this: $$E(Y)=\int_0^{0.5} 1\cdot1dx+\int_0^{0.5} 2\cdot1dx=0.5+2(0.5)=1.5$$ I'm not sure if what I'm doing is correct, hope you can help me.

There are 3 answers below.

Best answer:

An elementary way to handle this exercise is to use the theorem $$ E[g(X)]=\int _{-\infty}^{\infty }g(t)f_X(t)\,\mathrm d t\tag1 $$ known as the law of the unconscious statistician (stated here for a continuous random variable $X$). We then need to find a function $g:\Bbb R\to\Bbb R$ such that $g(X)=Y$.

Setting $$ g(t):=\begin{cases} 1,&t\leqslant 1/2\\ 2,&t>1/2 \end{cases}\tag2 $$ we find that $g(X)=Y$, so we can apply $\rm(1)$ with $f_X:=\mathbf{1}_{(0,1)}$, where $\mathbf{1}_{A}$ denotes the indicator function of the set $A$.

Hence $$ \begin{align*} E[Y]&=E[g(X)]=\int_{0}^1g(t)\,\mathrm d t=\frac1{2}+2\cdot \frac1{2}=\frac{3}{2}\tag3\\ E[Y^2]&=E[g(X)^2]=\int_{0}^1(g(t))^2\,\mathrm d t=\frac1{2}+4\cdot \frac1{2}=\frac{5}{2}\tag4 \end{align*} $$ from which we deduce that $\operatorname{Var}[Y]=E[Y^2]-(E[Y])^2=\frac{5}{2}-\frac{9}{4}=\frac1{4}$.
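As a quick sanity check (not part of the derivation itself), a Monte Carlo simulation in plain Python should land near these values; the seed and sample size below are arbitrary choices:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
n = 100_000

# Simulate X ~ Uniform(0, 1) and map it through g to get Y
ys = [1 if random.random() <= 0.5 else 2 for _ in range(n)]

ey = sum(ys) / n                    # estimate of E[Y], should be near 3/2
ey2 = sum(y * y for y in ys) / n    # estimate of E[Y^2], should be near 5/2
var = ey2 - ey ** 2                 # estimate of Var[Y], should be near 1/4
print(ey, ey2, var)
```

With $10^5$ samples the standard error of the mean is on the order of $10^{-3}$, so the estimates sit close to the exact values.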


Another way to handle this exercise is to use measure-theoretic definitions directly. In that case $$ \begin{align*} E[Y]:&=\int_\Omega Y(\omega )\,dP(\omega )\\ &=\int_{X^{-1}(-\infty ,1/2]}dP(\omega )+\int_{X^{-1}(1/2,\infty )}2\, dP(\omega )\\ &=\int_{(-\infty,1/2]}f_X(t)\,\mathrm d t+2\int_{(1/2,\infty )}f_X(t)\,\mathrm d t\\ &=\int_{(0,1/2]}\,\mathrm d t+2\int_{(1/2,1)}\,\mathrm d t\\ &=\frac{3}{2} \end{align*} $$ because $X$ has density $f_X:=\mathbf{1}_{(0,1)}$. Similarly $$ \begin{align*} E[Y^2]&=\int_{\Omega }Y^2(\omega )\,dP(\omega )\\ &=\int_{X^{-1}(-\infty ,1/2]}dP(\omega )+\int_{X^{-1}(1/2,\infty )}2^2\,dP(\omega )\\ &=\int_{(-\infty ,1/2]}f_X(t)\,\mathrm d t+4\int_{(1/2,\infty )}f_X(t)\,\mathrm d t\\ &=\int_{(0,1/2]}\,\mathrm d t+4\int_{(1/2,1)}\,\mathrm d t\\ &=\frac{5}{2} \end{align*} $$

Another answer:

$P\{\omega: Y(\omega) = 1\} = P\{\omega: X(\omega) \le 0.5\} = {1\over 2}$, likewise $P\{\omega: Y(\omega) = 2\} = {1\over 2}$. So $E(Y) = {1\over 2} \times 1 + {1\over 2}\times 2 = 1.5$, and $$ Var(Y) = \sum P\{Y = x\}\times (x - E(Y))^2 \\ = {1\over 2} \times 0.5^2 \times 2 = 0.25 $$
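Since $Y$ is discrete with only two values, this computation can be transcribed directly from the pmf; the following is an illustrative sketch, with the dictionary representation of the pmf being my own choice:

```python
# pmf of Y: P(Y=1) = P(X <= 0.5) = 1/2, P(Y=2) = P(X > 0.5) = 1/2
pmf = {1: 0.5, 2: 0.5}

ey = sum(y * p for y, p in pmf.items())        # E[Y] = 1.5

# Variance via the centered form sum P(Y=x) * (x - E[Y])^2
var = sum(p * (y - ey) ** 2 for y, p in pmf.items())
print(ey, var)  # 1.5 0.25
```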

A third answer:

Phrasing things as you might see them in an Introductory Probability Theory class:

Note that the probability density function of $X$, $f_X(x)$, equals 1 over its support, $[0,1]$. Then:

$$E(Y) = \int_0^1 y(x)\cdot f_X(x)\, dx = \int_0^{0.5} y(x)\cdot f_X(x)\, dx + \int_{0.5}^1 y(x)\cdot f_X(x)\, dx \\= \int_0^{0.5} 1\cdot 1\, dx + \int_{0.5}^1 2\cdot 1\, dx = \frac{1}{2} + 1 = \frac{3}{2}$$

$$E(Y^2) = \int_0^1 y(x)^2\cdot f_X(x)\, dx = \int_0^{0.5} y(x)^2\cdot f_X(x)\, dx + \int_{0.5}^1 y(x)^2\cdot f_X(x)\, dx \\ = \int_0^{0.5} 1^2\cdot 1\, dx + \int_{0.5}^1 2^2\cdot 1\, dx = \frac{1}{2} + 2 = \frac{5}{2}$$

Then $V(Y) = E(Y^2) - (E(Y))^2 = \frac{5}{2} - \left(\frac{3}{2}\right)^2 = \frac{1}{4}$.

Any time you have a function of a random variable, $Z = g(X)$, $Z$ is also a random variable. If we let $f_X(x)$ denote the probability density function of $X$, then $E(Z) = \int_{-\infty}^{\infty} g(x)\cdot f_X(x)\, dx$.
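That general formula can also be checked numerically. A midpoint Riemann sum over the support $(0,1)$ (an illustrative sketch, not part of the answer; the subinterval count is arbitrary) reproduces the values computed here, since $f_X = 1$ on $(0,1)$:

```python
def g(t):
    """The function mapping X to Y = g(X)."""
    return 1 if t <= 0.5 else 2

n = 100_000  # number of midpoint-rule subintervals on (0, 1)

# E[g(X)] = integral of g(t) * f_X(t) dt, with f_X = 1 on (0, 1)
ey = sum(g((i + 0.5) / n) for i in range(n)) / n
ey2 = sum(g((i + 0.5) / n) ** 2 for i in range(n)) / n
var = ey2 - ey ** 2
print(ey, ey2, var)  # 1.5 2.5 0.25
```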