Convergence in Probability with respect to expectation and variance


Here, $(X_n)_{n\geq 1}$ is a sequence of random variables.

Result: $X_n \rightarrow X$ in probability if $E(X_n) \rightarrow E(X)$ and $\operatorname{Var}(X_n)\rightarrow \operatorname{Var}(X)$ as $n\rightarrow \infty$.

Take any $ \varepsilon > 0.$

$$ P[\,|X_n - X| > \varepsilon\,] = P[\,|X_n - X|^2 > \varepsilon^2\,] \leq \frac{E[(X_n - X)^2]}{\varepsilon^2} = \frac{[E(X_n) - E(X)]^2 + E[(X_n - E(X_n))^2] - E[(X - E(X))^2]}{\varepsilon^2}. $$

I am not able to understand the last step. I tried many ways but could not arrive at the equality above.


There are 3 answers below.

BEST ANSWER

First, note that for any r.v.s $X,Y\in L_2$, $$ \mathsf{E}[X-Y]^2=(\mathsf{E}[X-Y])^2+\operatorname{Var}(X-Y). $$ Then the equality you're interested in is incorrect in general because $$ \operatorname{Var}(X-Y)\ne \operatorname{Var}(X)-\operatorname{Var}(Y) $$ unless $\operatorname{Var}(Y)=\operatorname{Cov}(X,Y)$. Take, for example, $X\sim N(0,1)$ and $Y=2X$. Then $$ 1=\operatorname{Var}(X-Y)\ne \operatorname{Var}(X)-\operatorname{Var}(Y)=-3. $$
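The counterexample is easy to check numerically (a sketch using NumPy; the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1)
y = 2 * x                           # Y = 2X

# Var(X - Y) = Var(-X) = 1, whereas Var(X) - Var(Y) = 1 - 4 = -3.
print(np.var(x - y))          # sample estimate, close to 1
print(np.var(x) - np.var(y))  # sample estimate, close to -3
```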


Since $$ \mathsf{E}[X-X_n]^2=(\mathsf{E}[X_n-X])^2+\operatorname{Var}(X_n)+\operatorname{Var}(X)-2\operatorname{Cov}(X_n,X), $$ convergence in prob. follows if, in addition, $\operatorname{Cov}(X_n,X)\to \operatorname{Var}(X)$.
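For intuition, here is a quick simulation of a case where this extra covariance condition does hold (a sketch; the construction $X_n = X + {}$ independent noise of size $1/n$ is my own illustration, not taken from the answer):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(500_000)  # samples of X ~ N(0, 1)

for n in (1, 10, 100):
    # Illustrative construction: X_n = X + independent N(0, 1/n^2) noise,
    # so Cov(X_n, X) = Var(X) = 1 and X_n -> X in probability.
    x_n = x + rng.standard_normal(x.size) / n
    cov = np.cov(x_n, x)[0, 1]          # sample Cov(X_n, X), stays near Var(X) = 1
    p = np.mean(np.abs(x_n - x) > 0.1)  # estimate of P(|X_n - X| > 0.1)
    print(n, round(cov, 3), round(p, 3))
```

As $n$ grows, the sample covariance stays near $\operatorname{Var}(X)=1$ while the exceedance probability drops to $0$, matching the sufficient condition above.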


I read the question as follows:

$EX_n \to EX$ and $var (X_n) \to var (X)$ implies $X_n \to X$ in probability.

If this is what you are stating then it is false. Let $X$ have a standard normal distribution and take $X_n=-X$ for all $n$ to get a counterexample.
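This counterexample is also easy to verify by simulation (a sketch; $\varepsilon = 1$ is an arbitrary choice): each $X_n = -X$ has exactly the same mean and variance as $X$, yet $|X_n - X| = 2|X|$ never becomes small.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1)
x_n = -x                            # X_n = -X for every n

# Moments match: E(X_n) = E(X) = 0 and Var(X_n) = Var(X) = 1 ...
print(x_n.mean(), x_n.var())

# ... but |X_n - X| = 2|X|, so P(|X_n - X| > 1) = P(|X| > 1/2), about 0.617,
# which does not shrink as n grows.
eps = 1.0
print(np.mean(np.abs(x_n - x) > eps))
```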


Basically he rewrites the RHS of the Markov inequality in terms of variances, to show that if you have convergence in $E(\cdot)$ and in $\operatorname{Var}(\cdot)$, then the whole RHS of the Markov inequality converges to $0$ for any choice of $\epsilon > 0$. In particular, he uses the artificial expansion (simply adding and subtracting $E(X_n)$):

$E[(X_n - X)^2] = E[((X_n - E(X_n)) + (E(X_n) - X))^2]$,

which, when expanded, gives the desired equality.
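For reference, carrying out the expansion term by term (a sketch; note that the cross term involves the random variable $X$ and reduces to $-2\operatorname{Cov}(X_n, X)$, as the accepted answer points out):

```latex
\begin{align*}
E\bigl[(X_n - X)^2\bigr]
  &= E\Bigl[\bigl((X_n - E(X_n)) + (E(X_n) - X)\bigr)^2\Bigr] \\
  &= \underbrace{E\bigl[(X_n - E(X_n))^2\bigr]}_{\operatorname{Var}(X_n)}
   + 2\,\underbrace{E\bigl[(X_n - E(X_n))(E(X_n) - X)\bigr]}_{-\operatorname{Cov}(X_n,\,X)}
   + E\bigl[(E(X_n) - X)^2\bigr],
\end{align*}
```

with the last term satisfying $E\bigl[(E(X_n) - X)^2\bigr] = (E(X_n) - E(X))^2 + \operatorname{Var}(X)$.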

To use convergence in variance and in quadratic mean, you should use the following facts:

$E[(X_n - E(X_n))^2] = \operatorname{Var}(X_n)$

$E[(X - E(X))^2] = \operatorname{Var}(X)$

Then use

$E(E(X))=E(X)$

Finally, if you let $n\rightarrow \infty$, then you have convergence for any choice of $\epsilon$.