I have what seems like a simple question, but I am not quite able to prove it. Suppose that $A_n$ is a sequence of random variables which converges strongly (i.e. almost surely) to zero. I am aware that this implies that $A_n$ converges in probability to $0$, but I am wondering about the rate of convergence.
Suppose that I wish to establish that $A_n = \mathcal O_P(f(n))$, where $f(n)$ is some positive, strictly monotonically decreasing function which decays to zero at some rate. My understanding is that almost sure convergence implies that, for any $\epsilon > 0$ and almost every outcome $\omega$, we have $A_n(\omega) > \epsilon$ for only finitely many $n$. But does this condition imply that there is some constant $C_f$ (potentially quite large!) such that, as $n \to \infty$, $$\Pr\left(A_n > C_f f(n)\right) \to 0 ?$$
Expanding on @saz's comment about the rate of decay: for any function $f(n)$ that you pick, I can find a function $g(n)$ which still decays to $0$, but does so at a much slower rate. In other words, we want $g(n) > C f(n)$ for any constant $C$ and all large enough $n$, or, equivalently, $\lim_{n \to \infty} \frac{g(n)}{f(n)} = \infty$; for instance, if $f(n) = 1/n$, take $g(n) = 1/\log n$. Using this $g(n)$, I can construct a specific counterexample as follows.
Let $\Omega$ be a probability space with only one outcome, $\omega$ (for instance, flipping a trick coin that always comes up heads). Then let $A_n(\omega) = g(n)$ for all $n$. Saying that $A_n$ converges to $0$ almost surely in this simple probability space is the same as saying that $g(n) \to 0$, which it does. However, since $g(n)/f(n) \to \infty$, for any constant $C$ we have $g(n) > C f(n)$ for all sufficiently large $n$, so $\Pr(A_n > C f(n))$ equals $1$ for all such $n$, and in particular $\Pr(A_n > C f(n)) \not\to 0$.
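For concreteness, here is a quick numerical sanity check of the counterexample. It uses the hypothetical choices $f(n) = 1/n$ and $g(n) = 1/\log n$ (any pair with $g/f \to \infty$ would do); since the probability space has a single outcome, $\Pr(A_n > C f(n))$ is just the indicator of $g(n) > C f(n)$:

```python
import math

# Hypothetical concrete choices (any g with g(n)/f(n) -> infinity works):
# f decays like 1/n; g = 1/log(n) decays to 0 but much more slowly,
# since g(n)/f(n) = n/log(n) -> infinity.
f = lambda n: 1.0 / n
g = lambda n: 1.0 / math.log(n)

def prob_exceeds(n, C):
    # A_n(omega) = g(n) for the single outcome omega, so
    # Pr(A_n > C*f(n)) is 1 exactly when g(n) > C*f(n), and 0 otherwise.
    return 1.0 if g(n) > C * f(n) else 0.0

# Even for a huge constant C, the probability is 1 once n is large enough.
C = 1e6
print([prob_exceeds(n, C) for n in (10, 10**6, 10**8)])  # → [0.0, 0.0, 1.0]
```

The crossover point depends on $C$, but for every fixed $C$ the probability is eventually identically $1$, which is exactly why no rate $f(n)$ can be extracted from almost sure convergence alone.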