Convergence in distribution and convergence of variance

Let $X_n \overset d \equiv N(0,\sigma^2_n)$ and $X_n \overset d \rightarrow X$. Further suppose that $\sigma^2_n \overset p \rightarrow \sigma^2 > 0$. Does it then follow that $X \overset d \equiv N(0,\sigma^2)$?

If not, can you give a counterexample? What if $\sigma^2_n\rightarrow \sigma^2$ in L1 or a.s.?

Thanks and regards.

EDIT: Here $\sigma^2_n$ is a random variable and $\sigma^2$ is a constant. The $\sigma^2_n$ may be dependent on the $X_n$. Is this assumption crucial?


Answer 1:

I solved it. Since $\sigma_n^2 \overset p \rightarrow \sigma^2$ and $\sigma^2$ is a constant, we have joint convergence $$(X_n,\sigma_n^2)\rightarrow^d(X,\sigma^2).$$ Hence, by the continuous mapping theorem (Slutsky), $$\frac{1}{\sigma_n}X_n\rightarrow^d\frac{1}{\sigma}X.$$ On the other hand, conditionally on $\sigma_n$ we have $X_n/\sigma_n \sim N(0,1)$, and since this conditional law does not depend on $\sigma_n$, in fact $\frac{1}{\sigma_n}X_n \overset d \equiv N(0,1)$ for every $n$. Therefore $\frac{1}{\sigma}X \overset d \equiv N(0,1)$, i.e. $X \overset d \equiv N(0,\sigma^2)$.
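A quick Monte Carlo sanity check of this argument (a sketch in Python/NumPy; the particular dependent construction of $\sigma_n^2$ below is just one illustrative assumption, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000   # Monte Carlo samples
n = 50        # index in the sequence
sigma2 = 4.0  # the limiting (constant) variance

# sigma_n^2 is random, shrinks to sigma^2 in probability as n grows
sigma2_n = sigma2 + rng.exponential(1.0 / n, size=N)
z = rng.standard_normal(N)
x_n = np.sqrt(sigma2_n) * z   # X_n | sigma_n^2 ~ N(0, sigma_n^2); X_n depends on sigma_n^2

# X_n / sigma_n is exactly N(0, 1) for every n -- the key fact above
ratio = x_n / np.sqrt(sigma2_n)
print(ratio.mean(), ratio.std())  # approximately 0 and 1
```

Here `ratio` equals `z` exactly, which is precisely the distributional identity the Slutsky step exploits.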

Answer 2:

I've adapted the proof from the classical result (which is true) for the case where your $\sigma_n$ are non-random. We use the fact that convergence in distribution is equivalent to pointwise convergence of characteristic functions. From your assumption we know that:

$$ \mathbb{E}(e^{itX_n}) \rightarrow \mathbb{E}(e^{itX}).$$

Let's compute it another way: conditionally on $\sigma_n$, you have $X_n \sim N(0,\sigma_n^2)$, so $$ \mathbb{E}(e^{itX_n}) = \mathbb{E}(\mathbb{E}(e^{itX_n}\,|\,\sigma_n)) = \mathbb{E}\left(e^{-\frac{\sigma_n^2t^2}{2}}\right).$$
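This conditioning identity is easy to check numerically (a sketch; the uniform distribution for $\sigma_n^2$ is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
t = 1.3

sigma2_n = rng.uniform(0.5, 2.0, size=N)          # any random variance works
x_n = np.sqrt(sigma2_n) * rng.standard_normal(N)  # X_n | sigma_n^2 ~ N(0, sigma_n^2)

lhs = np.exp(1j * t * x_n).mean()          # E[e^{itX_n}]
rhs = np.exp(-sigma2_n * t**2 / 2).mean()  # E[e^{-sigma_n^2 t^2 / 2}]
print(abs(lhs - rhs))  # small (Monte Carlo error only)
```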

But then, since $\sigma_n^2$ converges in probability, you can extract a subsequence (which we still denote $\sigma_n$) converging almost surely to $\sigma$, which by your hypothesis is non-random (this will be crucial). Since the integrand is bounded by $1$, the dominated convergence theorem gives, along this subsequence:

$$\mathbb{E}(e^{itX_n}) \rightarrow \mathbb{E}\left(e^{-\frac{\sigma^2t^2}{2}}\right) = e^{-\frac{\sigma^2t^2}{2}}, $$

which is the characteristic function of a Gaussian random variable with mean $0$ and variance $\sigma^2$. Since $\mathbb{E}(e^{itX_n})$ was already assumed to converge, the limit along the subsequence identifies the limit of the whole sequence.