expectation larger than limsup for sequence of random variables


Let's say we are given a sequence of real-valued random variables, $X_n, n\geq1$, subject to the following conditions:

a) $X_n\geq0$ P-a.s.

b) $\limsup_{n\to\infty} X_n=C>0$ P-a.s., where $C$ is a constant. In fact, we may assume that, for any given $c\leq C$, the event that $c$ is a limit point of $(X_n)$ has probability either 0 or 1.

c) $\mathbb{E}[X_n]\geq C$ for all $n\geq 1$, although this expectation might be infinite.

Can we deduce that $X_n$, in fact, converges to $C$ P-a.s. as $n\to\infty$? Or, equivalently, is it possible for this sequence to have a limit inferior strictly smaller than C?

Intuitively, it appears to me that such a sequence should in fact converge, but I can't seem to find a good line of argument for that (and, of course, I might simply be wrong and have not found a counterexample).

A bit of background: This question originated from thinking about some asymptotic properties of stochastic processes, so the conditions above are what I could extract so far.

Best answer:

Here is an explicit construction based on Michael's idea. Let $G$ be a geometrically distributed random variable with parameter $p=1/2$, so that $\mathbb{P}[G=k]=2^{-k}$ for $k\geq1$. Define $$ X_n:=\begin{cases}2^nC, &\text{for } n\leq G,\\ 0, &\text{for } n>G\ \text{even},\\ C, &\text{for } n>G\ \text{odd}. \end{cases}$$ Condition a) holds by construction. Since $G<\infty$ a.s., for all $n>G$ the sequence alternates between $0$ and $C$, so $\limsup_{n\to\infty}X_n=C$ P-a.s. (condition b)), while $\liminf_{n\to\infty}X_n=0<C$ P-a.s., so the sequence does not converge. For c), since $X_n\geq0$, we calculate $$ \mathbb{E}[X_n]\geq 2^nC\cdot\mathbb{P}[G\geq n]=2^nC\sum_{k\geq n}\frac{1}{2}\left(1-\frac{1}{2}\right)^{k-1}=C\sum_{k\geq0}\frac{1}{2^k}=2C\geq C.$$
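As a sanity check, the construction simulates easily. Below is a quick Monte Carlo sketch (not part of the original answer; the choice $C=1$, the sample size, and the seed are arbitrary):

```python
import random

C = 1.0
N_SAMPLES = 200_000

def sample_G():
    """Geometric with p = 1/2 on {1, 2, 3, ...}: number of fair-coin
    flips up to and including the first head."""
    g = 1
    while random.random() >= 0.5:
        g += 1
    return g

def X(n, g):
    """The random variable X_n from the construction, given G = g."""
    if n <= g:
        return 2**n * C
    return 0.0 if n % 2 == 0 else C

random.seed(0)
samples = [sample_G() for _ in range(N_SAMPLES)]

# The bound in the answer says E[X_n] >= 2C for every n; the empirical
# means should sit near or above 2C (the tail term 2^n * C is rare but
# large, so estimates for big n are noisy).
for n in (1, 2, 5, 10):
    mean = sum(X(n, g) for g in samples) / N_SAMPLES
    print(n, mean)
```

On a path with small $G$ (say $G=1$), the values $X_2, X_3, X_4, \dots$ read $0, C, 0, C, \dots$, which makes the failure of convergence concrete despite the expectations staying at or above $2C$.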