Independence of random sequences


Let $X_n$ and $Y_n$ be two random sequences. I'm trying to prove that if both of them converge in mean square (i.e. $\lim_{n\to\infty} E((X_n - X)^2) = 0$ and $\lim_{n\to\infty} E((Y_n - Y)^2) = 0$), then $aX_n + bY_n$ also converges in mean square, for some constants $a, b$.

For this, I started from the definition of mean square convergence for $aX_n + bY_n$, and I arrive at a point where the only remaining term in my limit is: $$ab \cdot \lim_{n\to\infty} E((X_n - X)(Y_n - Y))$$

I need to prove that this is equal to $0$.

Here is my question: can I use the fact that $X_n$ and $Y_n$ are random sequences to say that they are independent, so that I could split the expectation as follows: $$E((X_n - X)(Y_n - Y)) = E(X_n - X) \cdot E(Y_n - Y)$$

Then I would want to continue like this: by Jensen's inequality, $$|E(X_n - X)| \le \sqrt{E((X_n - X)^2)}.$$ As $$\lim_{n\to\infty} E((X_n - X)^2) = 0,$$ we have $$\lim_{n\to\infty} E(X_n - X) = 0.$$ The same reasoning applies to $\lim_{n\to\infty} E(Y_n - Y)$.

Conclusion: $$ab \cdot \lim_{n\to\infty} E((X_n - X)(Y_n - Y)) = 0$$
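As a quick numerical sanity check of the bound $|E(Z)| \le \sqrt{E(Z^2)}$ used in the step above (with $Z$ standing in for $X_n - X$), expectations can be approximated by sample averages; the distribution and its parameters below are arbitrary choices for illustration, not from the question:

```python
import numpy as np

# Monte Carlo check of |E(Z)| <= sqrt(E(Z^2)) (Jensen's inequality),
# with Z playing the role of X_n - X; the normal distribution and its
# parameters are arbitrary choices for this sketch.
rng = np.random.default_rng(2)
Z = rng.normal(loc=0.3, scale=2.0, size=100_000)
assert abs(np.mean(Z)) <= np.sqrt(np.mean(Z ** 2))
```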

Accepted answer:

To be most general, what you want is the Minkowski Inequality.

Recall that the space of square-integrable random variables on a probability space is a Banach space (and in fact a Hilbert space).

In particular, one defines the $L^{p}$ norm of a random variable $X$ by $||X||_{L^{p}}=\big(E(|X|^{p})\big)^{\frac{1}{p}}$ for all $1\leq p<\infty$, and we say that $X\in L^{p}$. (Obviously this is only relevant for random variables $X$ with finite $p$-th moment.)

Equipped with this, one has the Minkowski Inequality, which states that for $X,Y\in L^{p}$, $||X+Y||_{L^{p}}\leq ||X||_{L^{p}}+||Y||_{L^{p}}$; i.e., this is nothing but the triangle inequality in this normed vector space.
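As an illustration (not part of the original argument), the $p=2$ case of Minkowski's inequality can be checked numerically by approximating the $L^2$ norms with sample averages; the distributions below are arbitrary assumptions for this sketch:

```python
import numpy as np

# Numerical illustration of Minkowski's inequality for p = 2:
# ||X + Y||_{L^2} <= ||X||_{L^2} + ||Y||_{L^2}.
# The distributions of X and Y are arbitrary choices for this sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=100_000)
Y = rng.exponential(size=100_000)

def l2_norm(Z):
    """Monte Carlo estimate of ||Z||_{L^2} = (E(Z^2))^{1/2}."""
    return np.sqrt(np.mean(Z ** 2))

lhs = l2_norm(X + Y)
rhs = l2_norm(X) + l2_norm(Y)
assert lhs <= rhs  # the triangle inequality in L^2
```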

So in particular, you have $$||aX_{n}+bY_{n}-aX-bY||_{L^{2}}\leq ||a(X_{n}-X)||_{L^{2}}+||b(Y_{n}-Y)||_{L^{2}}$$

Now by assumption, you have that $||X_{n}-X||_{L^{2}}$ and $||Y_{n}-Y||_{L^{2}}$ both go to $0$ as $n\to\infty$.

Thus $||aX_{n}+bY_{n}-aX-bY||_{L^{2}}$ tends to $0$.

Hence $aX_{n}+bY_{n}$ converges in mean square to $aX+bY$.

What the other answer shows is valid in the Hilbert space $L^{2}$ due to the Cauchy–Schwarz inequality. But in fact, by the Minkowski Inequality above, you have that if $E(|X_{n}-X|^{p})\to 0$ and $E(|Y_{n}-Y|^{p})\to 0$, then $E(|aX_{n}+bY_{n}-aX-bY|^{p})\to 0$.

Second answer:

Since $\mathbb{E}[(aX_n-aX)^2]=a^2\mathbb{E}[(X_n-X)^2]$, the convergence $X_n\to X$ in $L^2$ immediately implies that $aX_n\to aX$ in $L^2$; a similar statement holds for $Y_n$. Therefore, w.l.o.g. we can assume that $a=1$ and $b=1$.

We have $$ (*)\qquad \mathbb{E}[\left((X_n+Y_n)-(X+Y)\right)^2]=\mathbb{E}[\left((X_n-X)+(Y_n-Y)\right)^2]=\mathbb{E}[(X_n-X)^2]+ \mathbb{E}[(Y_n-Y)^2]+2\mathbb{E}[(X_n-X)(Y_n-Y)].$$ By the Cauchy–Bunyakovsky–Schwarz inequality, $$ |\mathbb{E}[(X_n-X)(Y_n-Y)]|\le \sqrt{\mathbb{E}[(X_n-X)^2]\times\mathbb{E}[(Y_n-Y)^2]}, $$ so all three summands on the RHS of (*) go to zero as $n\to\infty$. Thus $X_n+Y_n\to X+Y$ in $L^2$.
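The argument can be illustrated with a small simulation (our own construction, not part of the answer): take $X_n = X + \varepsilon_n$ and $Y_n = Y + \delta_n$ with noise shrinking like $1/n$, so both sequences converge in mean square, and observe that the cross term in (*) obeys the Cauchy–Schwarz bound while the bound itself vanishes:

```python
import numpy as np

# Simulation sketch: X_n - X and Y_n - Y are independent noises with
# standard deviation 1/n, so both sequences converge in mean square.
# We check that the cross term E[(X_n - X)(Y_n - Y)] satisfies the
# Cauchy-Schwarz bound and that the bound itself shrinks with n.
rng = np.random.default_rng(1)
m = 200_000  # Monte Carlo sample size
bounds = []
for n in (1, 10, 100):
    dX = rng.normal(scale=1.0 / n, size=m)  # samples of X_n - X
    dY = rng.normal(scale=1.0 / n, size=m)  # samples of Y_n - Y
    cross = abs(np.mean(dX * dY))
    bound = np.sqrt(np.mean(dX ** 2) * np.mean(dY ** 2))
    assert cross <= bound  # Cauchy-Schwarz (exact for sample averages)
    bounds.append(bound)
# The bound decreases as n grows, so the cross term vanishes in the limit.
assert bounds[0] > bounds[1] > bounds[2]
```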