Let $a_i$ be a sequence of random reals (a discrete-time random process), and define the following two limits:
$$ A= \lim_{n \to \infty} \frac1n\sum_{i=1}^n a_i $$ $$ B= \lim_{k \to \infty} \lim_{n \to \infty} \frac1k\sum_{i=n-k+1}^n a_i$$
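For concreteness, here is a small numerical sketch of the two averages for finite $n$ and $k$: the full running mean behind $A$, and the trailing-window mean behind $B$. The choice of an i.i.d. standard normal sequence is just an illustrative stand-in, not part of the question.

```python
import random

random.seed(0)
n_max = 100_000
# Illustrative stand-in process: i.i.d. N(0, 1) samples.
a = [random.gauss(0.0, 1.0) for _ in range(n_max)]

# Finite-n version of A: (1/n) * sum_{i=1}^{n} a_i at n = n_max.
A_n = sum(a) / n_max

# Finite version of B's inner average: (1/k) * sum_{i=n-k+1}^{n} a_i
# over the trailing window of length k << n.
k = 10_000
B_nk = sum(a[n_max - k:]) / k

print(A_n, B_nk)  # both should be close to the common mean 0
```

For this i.i.d. example both quantities converge to the mean, but the question is precisely which weaker assumptions (stationarity of some order, ergodicity, etc.) still force the two limits to agree.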
What assumptions do we need on $a_i$ to guarantee that if $A$ exists then $A=B$? Does it follow from stationarity alone (first-order, second-order, $n$th-order, wide-sense, or strict-sense), or is ergodicity (or something else) also needed?
Is it possible to construct a sequence (either deterministic or random) for which $B$ exists but $A$ does not?