Almost sure convergence implies convergence in probability with arbitrary rate (up to constant)?


I have what seems like a simple question, but I am not quite able to prove it. Suppose that $A_n$ is a sequence of random variables, and we know that $A_n$ converges strongly to zero, i.e., almost surely. I am aware that this implies that $A_n$ converges in probability to $0$, but I am wondering about the rate of convergence.

Suppose that I wish to establish that $A_n = \mathcal O_P(f(n))$, where $f(n)$ is some positive, strictly monotonically decreasing function which decays to zero at some rate. My understanding is that strong convergence implies that, almost surely, for any $\epsilon > 0$ there are only finitely many $n$ such that $A_n > \epsilon$. But does this condition imply that there is some constant $C_f$ (potentially quite large) such that, for $n \gg 0$, $$\Pr\left(A_n > C_f f(n)\right) \to 0 ?$$

There are 2 answers below.

Accepted answer:

Expanding on @saz's comment about the rate of decay, for any function $f(n)$ that you pick, I can find a function $g(n)$ which still decays to $0$, but which does so at a much slower rate. In other words, we want $g(n) \gg C f(n)$ for any constant $C$ and large enough $n$, or, equivalently, $\lim_{n \to \infty} \frac{g(n)}{f(n)} = \infty$. Using this $g(n)$, I can construct a specific counterexample as follows.

Let $\Omega$ be a probability space with only one outcome, $\omega$ (for instance, flipping a trick coin that always comes up heads). Then, let $A_n(\omega) = g(n)$ for all $n$. Saying that $A_n$ converges to $0$ almost surely in this simple probability space is the same as saying that $g(n) \to 0$, which it does. However, since $g(n)/f(n) \to \infty$, we have $g(n) > Cf(n)$ for all sufficiently large $n$, no matter what $C$ is; for those $n$, $\mathrm{Pr}(A_n > Cf(n))$ is equal to $1$, so $\mathrm{Pr}(A_n > Cf(n)) \not\to 0$.
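As a concrete (hypothetical) instance of this counterexample, one can take $f(n) = 1/n$ and $g(n) = 1/\log n$, so that $g(n)/f(n) = n/\log n \to \infty$. A short Python sketch checks that even a large constant $C$ cannot make $Cf(n)$ dominate $g(n)$ eventually:

```python
import math

# Hypothetical concrete choices for the counterexample:
# target rate f(n) = 1/n, and a slower decay g(n) = 1/log(n),
# so that g(n)/f(n) = n/log(n) -> infinity.
def f(n):
    return 1.0 / n

def g(n):
    return 1.0 / math.log(n)

# On the one-point probability space, A_n = g(n) with probability 1,
# so P(A_n > C f(n)) = 1 exactly when g(n) > C f(n).
C = 1000.0  # even a large constant does not help
for n in (10_000, 100_000, 1_000_000):
    # g(n) > C f(n)  <=>  n / log(n) > C, which holds for all large n,
    # so the probability equals 1 there and cannot tend to 0.
    assert g(n) > C * f(n)
```

Any other pair with $g/f \to \infty$ (e.g. $f(n) = e^{-n}$, $g(n) = 1/n$) works the same way.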

Answer:

Given a sequence $(A_n)_{n \in \mathbb{N}}$ converging almost surely to $0$, it is possible to prove the existence of a function $f$ with the desired properties (see below), but without further information on the distribution of the random variables (e.g. tail/moment estimates) we cannot expect to pin down the rate of decay of the function $f$.

Existence of $f$: Since $A_n \to 0$ almost surely, we have, in particular,

$$\mathbb{P} \left( |A_n|> \frac{1}{k} \right) \xrightarrow[]{n \to \infty} 0$$

for any $k \in \mathbb{N}$, and therefore we can choose $n_k \in \mathbb{N}$ such that

$$\mathbb{P} \left( |A_n| > \frac{1}{k} \right) \leq \frac{1}{k} \quad \text{for all $n \geq n_k$.} \tag{1}$$

Without loss of generality we may assume that $0=n_1 < n_2 < \ldots$. If we define

$$f(n) := \frac{1}{k} \qquad \text{for $n \in [n_k,n_{k+1})$, $k \in \mathbb{N}$}$$

then $f>0$ is monotonically decreasing and, by $(1)$,

$$\mathbb{P}(|A_n|>f(n)) \xrightarrow[]{n \to \infty} 0.$$

Remarks:

  • Clearly, $f$ is, in general, not strictly monotone, but you can easily modify the construction to obtain a strictly monotone function (e.g. $\tilde{f}(n) := f(n) + 1/n$ does the job).
  • It suffices to have $A_n \to 0$ in probability (the almost sure convergence is not needed for the proof).
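To make the construction concrete, here is a short Python sketch. It assumes (hypothetically) that the tail probabilities are known in closed form, taking $\mathbb{P}(|A_n| > \epsilon) = e^{-n\epsilon}$ (i.e. $A_n$ exponential with rate $n$, which converges to $0$ in probability); it then picks the thresholds $n_k$ as in $(1)$ and builds the step function $f$:

```python
import math

# Hypothetical example: A_n ~ Exponential(rate n), so the tail
# probability P(|A_n| > eps) = exp(-n * eps) is known in closed
# form and is decreasing in n.
def tail_prob(n, eps):
    return math.exp(-n * eps)

def build_thresholds(K):
    """Choose n_1 < n_2 < ... < n_K with P(|A_n| > 1/k) <= 1/k for n >= n_k."""
    thresholds = []
    n = 0
    for k in range(1, K + 1):
        n += 1  # enforce strict monotonicity of the n_k
        while tail_prob(n, 1.0 / k) > 1.0 / k:
            n += 1
        thresholds.append(n)
    return thresholds

def f(n, thresholds):
    """f(n) = 1/k for n in [n_k, n_{k+1}), as in the answer."""
    k = 1
    for i, nk in enumerate(thresholds, start=1):
        if n >= nk:
            k = i
    return 1.0 / k

thresholds = build_thresholds(50)
# Along n = n_k, the bound (1) gives P(|A_n| > f(n)) <= 1/k -> 0:
tails = [tail_prob(nk, f(nk, thresholds)) for nk in thresholds]
```

In practice one rarely has such closed-form tails, which is exactly the answer's point: without distributional information the thresholds $n_k$, and hence the rate of $f$, cannot be pinned down in advance.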