Under which conditions is the limit of the mean equal to the limit of the mean of the last elements of a sequence?


Let $(a_i)$ be a sequence of random reals (a discrete-time random process), and define the following two limits:

$$ A= \lim_{n \to \infty} \frac1n\sum_{i=1}^n a_i $$

$$ B= \lim_{k \to \infty} \lim_{n \to \infty} \frac1k\sum_{i=n-k+1}^n a_i $$

What assumptions do we need on $(a_i)$ to guarantee that if $A$ exists, then $A=B$? Does this follow from stationarity (first-order, second-order, $n$th-order, wide-sense, or strict-sense), or is ergodicity (or some other condition) also needed?

Is it possible to construct a sequence (either deterministic or random) such that $B$ exists but $A$ does not?
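As a numerical sanity check for the deterministic case (a sketch only; the sequences $a_i = 1 + 1/i$ and the doubling-block 0/1 sequence are my own illustrative choices, not part of the question): for a convergent sequence both $A$ and $B$ exist and agree, while for a 0/1 sequence with runs of doubling length the Cesàro mean $\frac1n\sum_{i=1}^n a_i$ keeps oscillating, so $A$ fails to exist, and since the $k=1$ window mean is just $a_n$ itself, $B$ fails too.

```python
import numpy as np

def cesaro_means(a):
    """Running Cesàro means A_n = (1/n) * sum_{i=1}^n a_i for all n."""
    return np.cumsum(a) / np.arange(1, len(a) + 1)

def trailing_window_mean(a, k):
    """Trailing window mean (1/k) * sum_{i=n-k+1}^n a_i at n = len(a)."""
    return np.mean(a[-k:])

n = 200_000

# Case 1: convergent sequence a_i = 1 + 1/i. Both the Cesàro mean
# (candidate for A) and any fixed-k trailing window mean (inner limit
# defining B) approach the same limit, 1.
i = np.arange(1, n + 1)
a = 1.0 + 1.0 / i
A_tail = cesaro_means(a)[-1]
B_window = trailing_window_mean(a, 100)

# Case 2: 0/1 blocks of doubling length (0, 11, 0000, 11111111, ...).
# The running Cesàro mean oscillates roughly between 1/3 and 2/3,
# so A does not exist; a_n itself (the k=1 window) also diverges,
# so B does not exist either.
blocks, bit, length = [], 0, 1
while sum(len(b) for b in blocks) < n:
    blocks.append([bit] * length)
    bit, length = 1 - bit, 2 * length
b_seq = np.concatenate([np.array(blk) for blk in blocks])[:n]
A_run = cesaro_means(b_seq)

print(A_tail, B_window, A_run[n // 2:].min(), A_run[n // 2:].max())
```

The second case also suggests why a deterministic counterexample with $B$ existing but $A$ not is hard: taking $k=1$ in the inner limit shows that existence of $B$ forces $a_n$ itself to converge, and then $A$ exists by the Cesàro-mean theorem.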