Expectation value and variance of random variables


Let $(X,\mathcal F,\mu)$ be a probability space and let $f:X\rightarrow [0,\infty)$ be a random variable (i.e. measurable). The integral $$E=\int_X f \, d\mu$$ is the expectation value of $f$ and $$V=\int_X (f-E)^2 \, d\mu$$ is the variance of $f$.

  1. Show that, if the variance of $f$ is small, $f$ deviates from its expectation value with very small probability. Explicitly, show that the probability that $f$ deviates by $\varepsilon$ from $E$ ($\mu(\{x\in X:|f(x)-E|>\varepsilon\})$) is less than or equal to $\frac{V}{\varepsilon^2}$.

  2. Consider the "random" series $1\pm \frac{1}{2}\pm\frac{1}{4}\pm\frac{1}{8}\pm\cdots$ with the assignment of a $+$ or $-$ in the $n$th term decided by the toss of a coin. Compute its expectation value and variance. (Hint: first show that $2t-1=\sum\limits_{k=1}^\infty \frac{R_k(t)}{2^k}$)

Here's what I have so far:

  1. By Markov's inequality, $\mu(\{x\in X:|f(x)-E|>\varepsilon\})\leq\frac{1}{\varepsilon}\int_X|f(x)-E| \, d\mu$. Edit: since $|f(x)-E|>\varepsilon$ holds exactly when $(f(x)-E)^2>\varepsilon^2$, applying the same inequality to $(f-E)^2$ gives $\mu(\{x\in X:|f(x)-E|>\varepsilon\})=\mu(\{x\in X:(f(x)-E)^2>\varepsilon^2\})\leq\frac{1}{\varepsilon^2}\int_X(f(x)-E)^2 \, d\mu=\frac{V}{\varepsilon^2}$.

  2. I'm not sure how to prove the statement in the hint, but assuming it, we can let $f=2t$ and then define simple functions $s_1,s_2$ such that $s_1\leq f\leq s_2$. Somehow I need to choose these functions so that their expectation values and variances agree, but I don't know how to do that.


---

By Markov's inequality, $\Bbb{P}(X\geq c) \leq \frac{\Bbb{E}[h(X)]}{h(c)}$ for a random variable $X$ and a non-negative, non-decreasing, measurable function $h$, you get the first part: take $X=|f-E|$, $h(x)=x^2$ and $c=\varepsilon$.
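As a sanity check, here is a small Monte Carlo sketch of the resulting bound $\mu(|f-E|>\varepsilon)\leq V/\varepsilon^2$ (the exponential distribution, the sample size and the threshold $\varepsilon=2$ are arbitrary choices, not part of the question):

```python
import random

# Monte Carlo sketch of the Chebyshev bound mu(|f - E| > eps) <= V / eps^2.
# The exponential(1) distribution and sample size are arbitrary choices.
random.seed(0)
n = 100_000
sample = [random.expovariate(1.0) for _ in range(n)]

E = sum(sample) / n                          # empirical expectation
V = sum((x - E) ** 2 for x in sample) / n    # empirical variance

eps = 2.0
p = sum(1 for x in sample if abs(x - E) > eps) / n   # empirical tail probability
bound = V / eps ** 2

print(p, bound)
assert p <= bound   # the empirical tail sits below V / eps^2
```

The bound is far from tight here: for the exponential distribution the true tail is $e^{-3}\approx 0.05$, well below $V/\varepsilon^2 = 0.25$.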

---

Suitably invoking Chebyshev's inequality, and possibly proving it, is probably all that is expected in $\#1.$

In $\#2,$ you have $$ \operatorname E(X) = \operatorname E(1) + \operatorname E\left(\pm \frac 1 2\right) + \operatorname E\left( \pm \frac 1 4 \right) + \operatorname E \left( \pm \frac 1 8 \right) + \cdots $$ To justify the claim that the expected value of the sum equals the sum of the expected values, you might need to invoke absolute convergence (here the series of absolute values converges, since $\sum_k 2^{-k}<\infty$). Each term $\pm 2^{-k}$ takes its two values with probability $\frac12$, so its expectation is $0$ and hence $\operatorname E(X)=1$.
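A brute-force sketch that checks this on a truncation of the series, by enumerating all $2^n$ equally likely sign patterns (the depth $n=12$ is an arbitrary choice):

```python
from itertools import product

# Exact expectation of the truncated series 1 ± 1/2 ± ... ± 1/2^n,
# averaged over all 2^n equally likely sign patterns (n = 12 is arbitrary).
n = 12
total = 0.0
for signs in product((+1, -1), repeat=n):
    total += 1 + sum(s / 2 ** (k + 1) for k, s in enumerate(signs))
E = total / 2 ** n
print(E)   # each ± term has mean 0, so the expectation is 1
```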

---

For the expectation of the series, the previous answer already showed it to be equal to $1$ by using linearity of expectation. A relatively rapid way to determine the variance of the series is to note that variance is additive over independent variables, so for the truncated series we have

$$\operatorname{Var}\left[1 \pm \frac{1}{2} \pm \frac{1}{4}\pm \cdots \pm \frac{1}{2^n} \right] = \operatorname{Var}(1)+\sum_{k=1}^n \operatorname{Var} \left( \pm \frac{1}{2^k}\right) = \sum_{k=1}^n \frac{1}{4^k} =\frac{1-4^{-n}}{3} \xrightarrow[n\to\infty]{} \frac{1}{3}$$

To check this, I performed a simulation with $10^5$ trials (in each of them, the series was computed up to the $50^\text{th}$ term) and got a value of $\approx 0.334\,,$ in good agreement with the predicted value.
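A sketch along the lines of that simulation (the trial count and truncation depth are taken from the text above; the seed is an arbitrary choice):

```python
import random

# Estimate mean and variance of 1 ± 1/2 ± 1/4 ± ..., truncated at 50 terms,
# from 10^5 independent trials.
random.seed(1)
trials, terms = 10 ** 5, 50

def sample_series():
    return 1 + sum(random.choice((1, -1)) / 2 ** k for k in range(1, terms + 1))

values = [sample_series() for _ in range(trials)]
mean = sum(values) / trials
var = sum((v - mean) ** 2 for v in values) / trials
print(mean, var)   # should be close to E = 1 and Var = 1/3
```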

---

I'm not sure what your hint is supposed to mean (are you sure it is written correctly?). One possible interpretation is the following:

If you let $s_k: X\rightarrow \{\pm 1\}$, $k\geq 1$, denote a sequence of i.i.d. random variables uniform on $\{\pm 1\}$, then $$ t = 1 + \sum_{k\geq 1} s_k 2^{-k} $$ satisfies $$2t -1 = s_1 + \left(1 + \sum_{k\geq 1} s_{k+1} 2^{-k}\right) = s_1 + t' $$ with $t'$ a random variable following the same law as $t$. As $t$ (and $t'$) are bounded random variables (so they have finite variance) and $t'$ is independent of $s_1$, you get $$4 \operatorname{Var} (t) = \operatorname{Var} (2t-1) = \operatorname{Var}(s_1) + \operatorname{Var} (t') = 1 + \operatorname{Var} (t),$$ from which $\operatorname{Var}(t) = 1/3$. But the direct calculation by Anatole seems quite a lot easier.
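For what it's worth, the hint's identity can also be checked numerically. Below I assume the convention $R_k(t) = 2b_k(t) - 1$, where $b_k(t)$ is the $k$-th binary digit of $t\in[0,1)$ (other sign conventions for the Rademacher functions would flip the sign of the identity):

```python
# Check 2t - 1 = sum_{k>=1} R_k(t) / 2^k on a few sample points,
# taking R_k(t) = 2*b_k(t) - 1 with b_k(t) the k-th binary digit of t.
def R(k, t):
    return 2 * (int(t * 2 ** k) % 2) - 1

for t in (0.1, 0.3, 0.7, 0.9):
    s = sum(R(k, t) / 2 ** k for k in range(1, 53))   # truncate at 52 digits
    assert abs(s - (2 * t - 1)) < 1e-12
print("identity verified on the sample points")
```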