Rate of convergence in distribution


Imagine a sequence of random variables $\{X_n\}_{n=1}^\infty$ taking values in $[0,1]$ that converges pointwise (i.e., sure convergence) to a random variable $X$, with convergence factor $\lambda$: $$ |X_n(\omega)-X(\omega)|=O(\lambda^n),\quad\text{for all }\omega\in\Omega=[0,1]. $$ Sure convergence then implies convergence in distribution: $X_n\overset{\mathcal D}{\rightarrow}X$.

Question: can the convergence factor of $\,\overset{\mathcal D}{\rightarrow}\,$ be different from $\lambda$? My rough intuition is that convergence in distribution could be orders of magnitude faster, since it is less demanding than sure convergence.

I found out that the Wasserstein metric $W_1$ metrizes weak convergence in our particular case (because our space is a bounded subset of $\mathbb R$), so it can be used here to quantify the rate of weak convergence. Moreover, since we are on $\mathbb R$, $W_1$ admits the explicit formula:

$$ W_1(P_n,P)=\int_0^1|F_n(x)-F(x)|dx, $$

where $F_n$ and $F$ are the CDFs of the probability measures $P_n$ and $P$ (the laws of $X_n$ and $X$, respectively).
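To get a feel for this formula, here is a small numerical sketch with a toy sequence of my own choosing (not from the question): $X(\omega)=\omega$, so $P$ is uniform on $[0,1]$, and $X_n(\omega)=(1-\lambda^n)\,\omega$, so that $\sup_\omega|X_n(\omega)-X(\omega)|=\lambda^n$ and $P_n$ is uniform on $[0,1-\lambda^n]$. A direct computation gives $W_1(P_n,P)=\lambda^n/2$ for this example, so here the $W_1$ rate matches the sure-convergence rate.

```python
import numpy as np

def w1_via_cdfs(lam, n, grid=10**6):
    """Compute W_1(P_n, P) = int_0^1 |F_n(x) - F(x)| dx numerically
    for the toy example X(w) = w, X_n(w) = (1 - lam**n) * w."""
    x = np.linspace(0.0, 1.0, grid)
    F = x                                        # CDF of Uniform[0, 1]
    Fn = np.minimum(x / (1.0 - lam**n), 1.0)     # CDF of Uniform[0, 1 - lam**n]
    vals = np.abs(Fn - F)
    dx = x[1] - x[0]
    # trapezoidal rule (endpoint values are both 0 here)
    return dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

lam, n = 0.5, 6
print(w1_via_cdfs(lam, n))   # close to lam**n / 2 = 0.0078125
```

Of course, one example with $W_1(P_n,P)=\Theta(\lambda^n)$ only shows that a speed-up is not automatic; it says nothing about whether a faster rate is possible for other sequences.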

So my question is whether it is possible to obtain something like

$$ W_1(P_n,P)=o(\lambda^n) $$

which would mean that convergence in distribution is strictly faster than sure convergence in this context.

I have thought of using the equivalent quantile representation

$$ W_1(P_n,P)=\int_0^1|F^{-1}_n(y)-F^{-1}(y)|\,dy, $$ with $F^{-1}$ defined suitably as a generalized inverse (quantile function), because this formula involves distances in our domain $[0,1]$, which are bounded by $\lambda^n$.
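One observation in this direction (a standard coupling bound, assuming $\Omega=[0,1]$ carries the probability measure under which the laws $P_n$ and $P$ are taken): since $X_n$ and $X$ are defined on the same probability space, the pair $(X_n,X)$ is itself a coupling of $P_n$ and $P$, so

$$ W_1(P_n,P)\le\mathbb E|X_n-X|\le\sup_{\omega\in\Omega}|X_n(\omega)-X(\omega)|=O(\lambda^n). $$

Hence $W_1$ can never converge slower than $\lambda^n$, and the only open direction is whether it can converge strictly faster.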

Thanks.