Convergence in $L^1$ of ratio of random variables


If the sequences $\{X_n\}$ and $\{Y_n\}$ both converge almost surely and in $L^1$ to constants $a$ and $b \ne 0$, respectively, and each $Y_n$ is deterministically bounded below by some constant $0 < c \le b$, does this imply that $\mathbb{E}\left[\frac{X_n}{Y_n}\right]$ converges to $\frac{a}{b}$, i.e. does the ratio $\frac{X_n}{Y_n}$ converge in $L^1$ to $\frac{a}{b}$? [Follow-up: does $\frac{X_n}{Y_n}$ converge almost surely to $\frac{a}{b}$? Edit: Yes, the latter is an immediate consequence of the continuous mapping theorem.]

Of course, $\frac{X_n}{Y_n}$ converges in probability to $\frac{a}{b}$, but this does not imply convergence in $L^1$.
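To make the gap concrete, here is the classic example showing that convergence in probability does not imply convergence in $L^1$ (the construction below is a standard illustration of mine, not part of the question):

```python
import numpy as np

def prob_and_l1(n, eps=0.5, samples=100_000, seed=0):
    """Classic example: on (0,1) with Lebesgue measure,
    Z_n = n on (0, 1/n) and 0 elsewhere. Then
    P(|Z_n| > eps) = 1/n -> 0 (convergence in probability to 0),
    while E|Z_n| = n * (1/n) = 1 for every n (no L^1 convergence)."""
    u = np.random.default_rng(seed).uniform(size=samples)
    z = np.where(u < 1 / n, float(n), 0.0)
    return np.mean(np.abs(z) > eps), np.mean(np.abs(z))

for n in (10, 100, 1000):
    p, m = prob_and_l1(n)
    print(n, p, m)  # p shrinks toward 0 while the L^1 norm stays near 1
```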

Intuitively, I think that it should indeed converge in $L^1$, but I am having a tough time taking a crack at this. I've tried using indicator variables, etc., but that does not seem to be the right approach.

I would love to know whether this is true, at least for "nice" sequences of $X_n$'s and $Y_n$'s [in particular, products and sums of independent Gaussians and finite integer higher powers of independent Gaussians]. I don't need the proof as long as someone can point me to a relevant resource, but if a proof is available I would of course love to see that as well.


BEST ANSWER

Recall that convergence in $L^1$ implies convergence in probability. The bounded convergence theorem says that if $Z_n$ converges to a constant $z$ in probability, and if there is a finite constant $M$ such that $|Z_n|\leq M$ for all $n$, then $E[|Z_n-z|]\rightarrow 0$ and $E[Z_n]\rightarrow z$. (This can indeed be proved via the law of total expectation.)
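As a sanity check, here is a small simulation of the theorem (the particular sequence $Z_n$ is an illustrative choice of mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)

def E_Zn(n, z=1.0, samples=200_000):
    """E[Z_n] for a toy sequence satisfying the theorem's hypotheses:
    Z_n = z + 1 with probability 1/n, Z_n = z otherwise.
    Then |Z_n| <= z + 1 and Z_n -> z in probability, and indeed
    E[Z_n] = z + 1/n -> z."""
    bump = (rng.uniform(size=samples) < 1 / n).astype(float)
    return float(np.mean(z + bump))

for n in (2, 10, 100, 1000):
    print(n, E_Zn(n))  # approaches z = 1 as n grows
```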

Claim:

Let $\{X_n\}_{n=1}^{\infty}, \{Y_n\}_{n=1}^{\infty}$ be such that $X_n$ converges to a constant $a$ in $L^1$ (and hence in probability), $Y_n\geq 1$ for all $n$, and $Y_n$ converges in probability to a constant $b\geq 1$. Then $E\left[\frac{X_n}{Y_n}\right]\rightarrow a/b$. (The question's hypothesis $Y_n \ge c > 0$ reduces to this case: apply the claim to $X_n/c$ and $Y_n/c$, which converge to $a/c$ and $b/c \ge 1$.)

Proof:

For all $n \in \{1, 2, 3, \dots\}$ we have \begin{align} \frac{X_n}{Y_n} = \frac{a}{Y_n} + \frac{X_n-a}{Y_n} \implies E\left[\frac{X_n}{Y_n}\right] = E\left[\frac{a}{Y_n}\right] + E\left[\frac{X_n-a}{Y_n}\right] \end{align} Since $|a/Y_n| \leq |a|$ for all $n$ (using $Y_n \ge 1$), and $a/Y_n$ converges to $a/b$ in probability, the bounded convergence theorem gives $E[a/Y_n]\rightarrow a/b$. Hence, it remains only to prove $E[(X_n-a)/Y_n]\rightarrow 0$. But this is true since:

$$ |E[(X_n-a)/Y_n]|\leq E[|X_n-a|/Y_n] \leq E[|X_n-a|] \rightarrow 0 $$ where the final limit holds because $X_n$ converges to $a$ in $L^1$. $\Box$
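To illustrate the claim numerically, here is a Monte Carlo sketch with one concrete pair of sequences satisfying its hypotheses (the truncated-Gaussian construction and the constants $a$, $b$ are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 3.0  # illustrative constants, with b >= 1

def estimate_ratio_mean(n, samples=200_000):
    """Monte Carlo estimate of E[X_n / Y_n] for
      X_n = a + Z/sqrt(n)          (-> a in L^1; Z standard normal)
      Y_n = max(b + W/sqrt(n), 1)  (-> b in probability, and Y_n >= 1)."""
    z = rng.standard_normal(samples)
    w = rng.standard_normal(samples)
    x = a + z / np.sqrt(n)
    y = np.maximum(b + w / np.sqrt(n), 1.0)
    return float(np.mean(x / y))

for n in (1, 10, 100, 10_000):
    print(n, estimate_ratio_mean(n))  # approaches a/b = 2/3 as n grows
```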

ANSWER

Without the lower-bound assumption on $Y_n$, this is false. Consider $(0,1)$ with Lebesgue measure and take $X_n \equiv 1$, $Y_n=\frac 1 {n^{2}}$ on $(0,\frac 1 n)$, and $Y_n =1$ on $[\frac 1 n, 1)$. Then $X_n \to 1$ and $Y_n \to 1$ almost surely, but $$E\frac {X_n} {Y_n} \geq \int_0^{1/n} n^{2}\,dx=n \to \infty.$$ Note that $E|Y_n -1| \leq \int _0 ^{1/n} \left|\frac 1 {n^{2}} -1\right|dx \leq \int _0 ^{1/n} 2\,dx \to 0$, so $Y_n \to 1$ in $L^{1}$ as well. (Here $\min_\omega Y_n(\omega) = \frac 1 {n^{2}} \to 0$, so no single constant $c>0$ bounds all the $Y_n$ from below; this shows that the question's lower-bound hypothesis cannot be dropped.)
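The counterexample's expectation can be computed exactly: $E\frac{X_n}{Y_n} = n^2 \cdot \frac 1 n + 1 \cdot \left(1 - \frac 1 n\right) = n + 1 - \frac 1 n$. A quick numerical check of the divergence (the Monte Carlo sampling scheme is my own addition):

```python
import numpy as np

def ratio_mean_mc(n, samples=1_000_000, seed=0):
    """Monte Carlo E[X_n/Y_n] for the counterexample: omega ~ Uniform(0,1),
    X_n = 1 everywhere, Y_n = 1/n^2 on (0, 1/n) and Y_n = 1 on [1/n, 1)."""
    u = np.random.default_rng(seed).uniform(size=samples)
    y = np.where(u < 1.0 / n, 1.0 / n**2, 1.0)
    return float(np.mean(1.0 / y))

def ratio_mean_exact(n):
    # n^2 * (1/n) + 1 * (1 - 1/n) = n + 1 - 1/n
    return n + 1 - 1 / n

for n in (2, 5, 10, 50):
    print(n, ratio_mean_mc(n), ratio_mean_exact(n))
# the mean blows up like n, even though Y_n -> 1 a.s. and in L^1
```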