Almost sure convergence in parameters preserves convergence in distribution

Let $X_n$, $n\in\mathbb{N}$, denote a sequence of real-valued random variables, each a function of a real-valued parameter $c$, that converges in distribution to the standard normal distribution; i.e. $X_n(c) \stackrel{d}{\to}\mathcal{N}(0,1)$ as $n\to\infty$.

Now, assume $c$ is unknown, but the sequence $c_n$, $n\in\mathbb{N}$, converges almost surely to $c$.

What are sufficient conditions for the functions $X_n(c)$, such that $X_n(c_n) \stackrel{d}{\to}\mathcal{N}(0,1)$ as $n\to\infty$? (As stated above, at the limit point $c$ the sequence $X_n(c)$ also converges in distribution to $\mathcal{N}(0,1)$.)

This statement cannot be true in general. But I assume it holds if the $X_n$ are continuous functions of $c$. Does it?
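To see concretely why continuity alone cannot suffice, here is a small numerical sketch of mine (not from the post, and with my own choice of statistic): take $X_n(c)=\sqrt{n}\,(\bar x_n-c)$ for i.i.d. $\mathcal{N}(c,1)$ samples, which is exactly $\mathcal{N}(0,1)$ for every $n$ and linear (hence smooth) in $c$, and plug in the strongly consistent estimator $c_n=\bar x_n$:

```python
import math
import random
import statistics

# Sketch (not from the post): X_n(c) = sqrt(n) * (sample mean - c) for n iid
# N(c, 1) draws is exactly N(0,1) for every n and is continuous (linear) in c.
# Plugging in the strongly consistent estimator c_n = sample mean gives
# X_n(c_n) = 0 identically, so continuity in c alone is not sufficient.
random.seed(1)
c_true = 2.0
n = 10_000
sample = [random.gauss(c_true, 1.0) for _ in range(n)]
c_n = statistics.fmean(sample)  # converges a.s. to c_true by the SLLN

X_at_c = math.sqrt(n) * (c_n - c_true)  # one draw from N(0,1)
X_at_cn = math.sqrt(n) * (c_n - c_n)    # identically 0, not N(0,1)
print(X_at_c, X_at_cn)
```

Here the failure comes from $c_n$ being correlated with $X_n$, which the question's phrasing does not exclude.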

[EDIT] Here is a counterexample for $X_n:(0,\infty)\rightarrow\mathbb{R}$ continuous. (This doesn't seem to work for $X_n$ defined over all of $\mathbb{R}$, as you commented).

Let $N\sim \mathcal{N}(0,1)$ and define: $$ X_n(c):=\begin{cases} N, & \text{if }n>c^{-1}+1,\\ N(n-c^{-1}), & \text{if }c^{-1}\leq n\leq c^{-1}+1,\\ 0, & \text{else.} \end{cases} $$

In other words, for fixed $c$, $X_n(c)$ is $0$ for small $n$ and normally distributed for large $n$; in between, the definition interpolates linearly, which makes $c\mapsto X_n(c)$ continuous. (One could even replace the linear interpolation by a polynomial one, via Weierstrass approximation, to get a smooth counterexample.) Evidently, $X_n(n^{-1})=0$ for all $n\in\mathbb{N}$: along the sequence $c_n=n^{-1}$ the plug-in values are identically zero, which completes the counterexample.
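As a quick sanity check (my sketch, not part of the answer), the piecewise definition can be evaluated with the realized value of $N$ treated as a fixed number; along $c_n=n^{-1}$ the sequence vanishes, while for any fixed $c>0$ it eventually equals $N$:

```python
# Sketch (not from the answer): evaluate the piecewise definition of X_n(c),
# abstracting the realization of N ~ N(0,1) as a fixed number.
def X(n, c, N):
    inv = 1.0 / c
    if n > inv + 1:
        return N              # for large n, X_n(c) is the N(0,1) variable N
    if inv <= n <= inv + 1:
        return N * (n - inv)  # linear interpolation zone
    return 0.0                # for small n, X_n(c) = 0

N = 1.7  # stand-in for one realization of N

# Along c_n = 1/n the plug-in sequence is zero (up to floating-point error):
assert all(abs(X(n, 1.0 / n, N)) < 1e-10 for n in range(1, 100))

# For fixed c = 0.1, X_n(c) equals N once n > 1/c + 1 = 11:
assert all(X(n, 0.1, N) == N for n in range(12, 100))
print("checks passed")
```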

I think these references might be helpful to find sufficient conditions: this and that.

Here is another counterexample, perhaps more convincing.

Let $B_t$ be a standard Brownian motion. Then the following map is almost surely continuous: $$ X_n:\mathbb{R}\rightarrow\mathbb{R},\quad c\mapsto \frac{1}{\sqrt{e^c}}B_{e^c}. $$ For every $c\in\mathbb{R}$, $X_n(c)\sim\mathcal{N}(0,1)$, by the scaling property of Brownian motion. Moreover, define the following sequence: $$ c_n:=\begin{cases} 0,&\text{if }B_1>0,\\ \ln (2),&\text{else.} \end{cases} $$

Trivially, this is an almost surely convergent sequence, since it does not depend on $n$. Note that $X_n(c_n)=B_1$ on $\{B_1>0\}$ and $X_n(c_n)=\frac{1}{\sqrt{2}}B_2$ otherwise, so for all $n\in\mathbb{N}$: $$ \mathbb{P}[X_n(c_n)>0]=\mathbb{P}\left[\{B_1>0\}\cup\left\{\frac{1}{\sqrt{2}}B_{2}>0\right\}\right] =\mathbb{P}[B_1>0]+\mathbb{P}\left[\left\{\frac{1}{\sqrt{2}}B_{2}>0\right\}\setminus \{B_1>0\} \right] =\frac{1}{2}+\gamma, $$ where $\gamma=\mathbb{P}[B_1\leq 0,\ B_2>0]>0$. So, $X_n(c_n)$ cannot have a normal distribution. And since $X_n(c_n)$ does not depend on $n$, it is identical to its limit, so it does not converge to a normally distributed random variable either.
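As a numerical sanity check (my sketch, not part of the answer), one can simulate $(B_1,B_2)$ directly; for this bivariate normal pair (correlation $1/\sqrt{2}$) the standard orthant formula gives $\gamma=\tfrac14-\tfrac{\arcsin(1/\sqrt{2})}{2\pi}=\tfrac18$, so $\mathbb{P}[X_n(c_n)>0]=5/8$:

```python
import math
import random

# Monte Carlo check (not from the answer): with c_n as above,
# X_n(c_n) = B_1 if B_1 > 0, else B_2 / sqrt(2).
random.seed(0)
n_samples = 200_000
hits = 0
for _ in range(n_samples):
    b1 = random.gauss(0.0, 1.0)       # B_1
    b2 = b1 + random.gauss(0.0, 1.0)  # B_2 = B_1 + independent N(0,1) increment
    x = b1 if b1 > 0 else b2 / math.sqrt(2.0)
    hits += x > 0

p_hat = hits / n_samples
print(p_hat)  # should be close to 5/8 = 0.625, clearly above 1/2
```

The estimate sitting well above $1/2$ confirms that $X_n(c_n)$ is not standard normal.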