Stable laws of probability and convergence.


I'm working on a probability exercise and I'm stuck at some point. I did the first 3 questions without trouble. Here is what it says:

We consider $(\Omega,\mathcal{A},\mathbb{P})$ a probability space. For $\alpha \in ]0,2]$ we say that a real-valued random variable follows a symmetric $\alpha$-stable law of scale parameter $s>0$, written $X \sim S\alpha S(s)$, if

\begin{align} \forall t\in \mathbb{R}, \phi_X(t)=\mathbb{E}(e^{itX})=e^{-s^{\alpha}|t|^{\alpha}} \end{align}

I showed that such an $X$ is symmetric and that if $X$ has a moment of order 1 then $\mathbb{E}(X)=0$. Then we showed that for $\alpha=2$, if $X \sim \mathcal{N}(0,\sigma^2)$ for $\sigma>0$ then $X \sim S\alpha S(\frac{\sigma}{\sqrt{2}})$, and that if $X\sim \mathcal{C}(1)$ then, for $s>0$, $sX \sim S\alpha S(s)$.
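As a quick numerical sanity check of the Cauchy case (my own sketch, not part of the exercise; the scale $s=0.7$, the sample size, and the seed are arbitrary choices): for a standard Cauchy $X$, the empirical characteristic function of $sX$ should be close to $e^{-s|t|}$.

```python
import numpy as np

rng = np.random.default_rng(0)
s = 0.7
# s * (standard Cauchy) should follow S1S(s), i.e. E[e^{itY}] = e^{-s|t|}
y = s * rng.standard_cauchy(10**6)

for t in (0.5, 1.0, 2.0):
    # the imaginary part E[sin(tY)] vanishes by symmetry, so the empirical
    # characteristic function reduces to the sample mean of cos(tY)
    empirical = np.cos(t * y).mean()
    print(f"t={t}: empirical={empirical:.4f}, exact={np.exp(-s * abs(t)):.4f}")
```

With $10^6$ samples the two values agree to a couple of decimal places (the integrand $\cos(tY)$ is bounded, so the Monte Carlo error is small even though the Cauchy law has no moments).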

Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of independent r.v. of law $S\alpha S(s)$ with $\alpha\in ]0,2]$ and $s>0$. Let $S_n=\sum\limits_{k=1}^nX_k$ for $n\geq1$.

We had to show that $S_n \sim S\alpha S(n^{\frac{1}{\alpha}}s)$, for all $n\geq1$.

What I did for this is: \begin{align} \forall t\in \mathbb{R}, \phi_{S_n}(t)&= \mathbb{E}(e^{itS_n}) \\ &=\mathbb{E}(e^{it\sum\limits_{k=1}^nX_k})\\ &=\prod\limits_{k=1}^n\mathbb{E}(e^{itX_k}) \textbf{ because the $(X_n)_{n\in\mathbb{N}}$ are independent} \\ &=\prod\limits_{k=1}^n e^{-s^{\alpha}|t|^{\alpha}} \\ &=(e^{-s^{\alpha}|t|^{\alpha}})^n \\ &=e^{-ns^{\alpha}|t|^{\alpha}} \\ &=e^{-(n^{\frac{1}{\alpha}})^{\alpha}s^{\alpha}|t|^{\alpha}} \\ &=e^{-(n^{\frac{1}{\alpha}}s)^{\alpha}|t|^{\alpha}} \end{align}

We recognized the characteristic function of a r.v. $X \sim S\alpha S(n^{\frac{1}{\alpha}}s)$, so $S_n \sim S\alpha S(n^{\frac{1}{\alpha}}s)$.
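This stability under summation can also be checked numerically for $\alpha=1$ (again my own sketch, with arbitrary parameters): the empirical characteristic function of $S_n$ built from scaled Cauchy variables should be close to $e^{-ns|t|}$.

```python
import numpy as np

rng = np.random.default_rng(1)
s, n_terms = 0.5, 4
# 200000 independent copies of S_n = X_1 + ... + X_n with X_k ~ S1S(s) (scaled Cauchy)
S = (s * rng.standard_cauchy((200_000, n_terms))).sum(axis=1)

t = 1.0
empirical = np.cos(t * S).mean()        # imaginary part vanishes by symmetry
exact = np.exp(-n_terms * s * abs(t))   # e^{-(n^{1/1} s)^1 |t|^1} = e^{-ns|t|}
print(empirical, exact)
```

Both numbers should be close to $e^{-2}\approx 0.135$, in agreement with $S_n \sim S\alpha S(n^{\frac{1}{\alpha}}s)$ for $\alpha=1$, $n=4$, $s=0.5$.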

The next question is where I am stuck:

Verify that for all $n\in\mathbb{N^*}$, $\frac{S_n}{n^{\frac{1}{\alpha}}}$ has the same law as $X_1$, and deduce that $(\frac{S_n}{n^{\frac{1}{\alpha}}})_n$ converges in law to $X_1$.

I tried using the characteristic function, but I do not know the law of $\frac{X_i}{n^{\frac{1}{\alpha}}}$; I was wondering if it has the same law as $X_i$ for all $n\geq i\geq1$. As for the convergence, I wanted to use the definition by taking $f$ a bounded continuous function from $\mathbb{R}$ to $\mathbb{R}$ and showing that $\lim\limits_{n \to \infty}\mathbb{E}(f(\frac{S_n}{n^{\frac{1}{\alpha}}}))=\mathbb{E}(f(X_1))$. Is this a good idea?

Then the next question is :

We suppose that $\mathbb{E}(|X_1|)<\infty$

i) What can we say of the convergence of $(\frac{S_n}{n})_n$ ?

We have,

For $\alpha=1$, $\forall n \in \mathbb{N}^*, \frac{S_n}{n} \sim X_1$, and since $X_1$ is integrable the strong law of large numbers gives us $\frac{S_n}{n}\underset{n\to+\infty}{\longrightarrow}\mathbb{E}[X_1]=0$ a.s.

ii) Then what will be the limit of $(\frac{S_n}{n^{\frac{1}{\alpha}}})_n$ when $\alpha \leq1$?

For this one I don't really know; I think the limit will be zero for $\alpha < 1$, which contradicts the fact that $\frac{S_n}{n^{\frac{1}{\alpha}}}$ has the same law as the non-degenerate $X_1$ (and for $\alpha=1$, $\frac{S_n}{n} \sim X_1$ cannot converge a.s. to a constant). Then we have an absurdity and we conclude that $\mathbb{E}(|X_1|)=\infty$ when $\alpha \leq 1$.

The last part is

We suppose $\alpha \in ]1,2]$ and $\mathbb{E}(X_1^2)< \infty$

i) What can we say of the convergence of $(\frac{S_n}{\sqrt{n}})_n$?

I want to use the central limit theorem, but I do not know $Var(X_1)$; I only know that it exists by hypothesis. If I could prove that $Var(X_1)=n^2$ then $\sqrt{Var(X_1)}=n$, so that we would have $\frac{\sqrt{n}}{\sqrt{n^2}}=\frac{\sqrt{n}}{n}=\frac{1}{\sqrt{n}}$.

Then for $\alpha=2$, for all $n \in \mathbb{N}^*$, $\frac{S_n}{\sqrt{n}}$ has the same law (we showed it for all $\alpha \in ]0,2]$) as $X_1 \sim S\alpha S(s)$. We have that $X_1$ is square integrable and the $(X_n)_n$ are independent. We also have $\mathbb{E}(X_1)=0$.

The central limit theorem then gives, \begin{align} (\frac{S_n}{\sqrt{n}})\underset{n\to+\infty}{\overset{\mathcal{L}}{\longrightarrow}} \mathcal{Z} \textbf{ where $\mathcal{Z} \sim \mathcal{N}(0,1)$} \end{align}
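As a numerical sanity check of this step (my own sketch with arbitrary parameters, not part of the exercise): for $\alpha=2$ we have $X_1 \sim \mathcal{N}(0,2s^2)$, so the variance of $\frac{S_n}{\sqrt{n}}$ should be $2s^2=\operatorname{Var}(X_1)$, which equals $1$ only if $s=\frac{1}{\sqrt{2}}$.

```python
import numpy as np

rng = np.random.default_rng(2)
s = 0.8
sigma = s * np.sqrt(2)  # alpha = 2: X ~ S2S(s) means X ~ N(0, 2 s^2)
n_terms, n_samples = 20, 100_000

# n_samples independent copies of S_n / sqrt(n)
S_scaled = rng.normal(0.0, sigma, (n_samples, n_terms)).sum(axis=1) / np.sqrt(n_terms)

# the empirical variance should be close to Var(X_1) = 2 s^2 = 1.28, not 1
print(np.var(S_scaled))
```

This matches the observation (made precise in the answer below the question) that the limit law of $\frac{S_n}{\sqrt{n}}$ has variance $\operatorname{Var}(X_1)$.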

The next question is :

ii. Verify that for all $t\in \mathbb{R}$, $\mathbb{P}(\frac{S_n}{\sqrt{n}} \leq t)=\mathbb{P}(X_1 \leq tn^{\frac{1}{2}-\frac{1}{\alpha}})$

This I did easily by dividing both sides of the inequality by $n^{\frac{1}{\alpha}}$ and using the fact that $\frac{S_n}{n^{\frac{1}{\alpha}}}$ has the same law as $X_1$.
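For reference, one way to write this verification out:

\begin{align*} \mathbb{P}\left(\frac{S_n}{\sqrt{n}} \leq t\right) = \mathbb{P}\left(S_n \leq t\sqrt{n}\right) = \mathbb{P}\left(\frac{S_n}{n^{\frac{1}{\alpha}}} \leq t\, n^{\frac{1}{2}-\frac{1}{\alpha}}\right) = \mathbb{P}\left(X_1 \leq t\, n^{\frac{1}{2}-\frac{1}{\alpha}}\right), \end{align*}

the last equality holding because $\frac{S_n}{n^{\frac{1}{\alpha}}}$ and $X_1$ have the same law.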

And finally (at last!),

iii. Deduce that $\mathbb{E}(X_1^2)=\infty$ if $\alpha \in [1,2[$.

I am unsure what I should do here; should I use the fact that it converges in law to a standard Gaussian distribution?

Thank you very much for your time.


There is 1 answer below.


The characteristic function of $\frac{S_n}{n^{\frac{1}{\alpha}}}$ at $t$ is simply $\phi_{S_n}\left(t/n^{\frac{1}{\alpha}}\right)$, so you can use your previous computation with $t$ replaced by $t/n^{\frac{1}{\alpha}}$. Then you can use the fact that if $Y_n$ has the same law as $Y$ for every $n$, then $Y_n$ converges to $Y$ in distribution.
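Spelled out, using $\phi_{S_n}(u)=e^{-ns^{\alpha}|u|^{\alpha}}$ from the computation in the question:

\begin{align*} \phi_{\frac{S_n}{n^{1/\alpha}}}(t) = \phi_{S_n}\left(\frac{t}{n^{\frac{1}{\alpha}}}\right) = e^{-ns^{\alpha}\left|\frac{t}{n^{1/\alpha}}\right|^{\alpha}} = e^{-ns^{\alpha}\frac{|t|^{\alpha}}{n}} = e^{-s^{\alpha}|t|^{\alpha}} = \phi_{X_1}(t). \end{align*}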

Then what will be the limit of $(\frac{S_n}{n^{\frac{1}{\alpha}}})_n$ when $\alpha\leqslant 1$?

Use the fact that $1/\alpha\geqslant 1$ and write $\frac{S_n}{n^{\frac{1}{\alpha}}}=\frac{S_n}{n}\cdot n^{1-\frac{1}{\alpha}}$; under the integrability hypothesis the first factor tends to $0$ a.s., and if the inequality is strict the second factor also tends to $0$, so the limit is $0$.

For the last part, the variance of $X_1$ cannot depend on $n$. Since $\mathbb{E}(X_1)=0$ (by the observation made at the beginning), $\operatorname{Var}\left(X_1\right)=\mathbb E\left[X_1^2\right]>0$, hence \begin{align} \left(\frac{S_n}{\sqrt{n}}\right)\underset{n\to+\infty}{\overset{\mathcal{L}}{\longrightarrow}} \mathcal{Z} \textbf{ where $\mathcal{Z} \sim \mathcal{N}\left(0,\mathbb E\left[X_1^2\right] \right)$.} \end{align} Now consider the equality $$ \mathbb{P}\left(\frac{S_n}{\sqrt{n}} \leq t\right)=\mathbb{P}\left(X_1 \leq tn^{\frac{1}{2} - \frac{1}{\alpha}}\right)$$ and let $n$ go to infinity for a fixed $t>0$. The left-hand side tends to $$ \mathbb P\left(\sqrt{\mathbb E\left[X_1^2\right]}\cdot N\leqslant t\right), \qquad N\sim\mathcal N(0,1),$$ which is $>\frac12$ and depends on $t$. For the right-hand side, $\frac{1}{2}-\frac{1}{\alpha}<0$ when $\alpha<2$, so $tn^{\frac{1}{2}-\frac{1}{\alpha}}\to 0$ and the right-hand side tends to the constant $\mathbb P\left(X_1\leqslant 0\right)=\frac12$ (the law of $X_1$ is symmetric and has a density, since its characteristic function is integrable). This is a contradiction, so $\mathbb E\left[X_1^2\right]=\infty$.