Necessary and sufficient condition for weak convergence of gamma distribution


Let $(X_n)_n$ be a sequence of random variables such that $f_{X_n}(x)=\frac{1}{\Gamma(\alpha_n)}\lambda_n^{\alpha_n}x^{\alpha_n-1}e^{-\lambda_nx}1_{]0,+\infty[}(x),$ where $\alpha_n>0,\lambda_n>0.$

  1. Suppose that $\alpha_n=1,\forall n \in \mathbb{N}.$ Find a necessary and sufficient condition on $(\lambda_n)_n$ such that $(X_n)_n$ converges in distribution.

  2. More generally, find a necessary and sufficient condition on $\alpha_n,\lambda_n$ so that $(X_n)_n$ converges in distribution.

The first part is easy, it converges in distribution if and only if $0<\liminf_n\lambda_n=\limsup_n\lambda_n$.

Concerning part 2), is it true that a condition of weak convergence is the convergence of $(\alpha_n)$ and $(\lambda_n)$?

There are 4 solutions below.


"Is it true that a condition of weak convergence is the convergence of $\alpha_n$ and $\lambda_n$?" The answer is negative.

Counterexample:

Suppose $\lambda_n \to +\infty$, $\alpha_n = \bar{o}(\lambda_n)$, $n \to \infty$, and let $X_n$ have distribution $\Gamma(\lambda_n, \alpha_n)$. We have

$$\mathbb{E} e^{i t X_n} = \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = e^{-\frac{\alpha_n}{\lambda_n} \cdot \lambda_n \ln\bigl( 1 - \frac{it}{\lambda_n} \bigr)}.$$

Hence,

$$\lim_{n \to \infty} \mathbb{E} e^{i t X_n} = \lim_{n \to \infty} e^{-\bar{o}(1) (-it)(1+\bar{o}(1))} = 1 = \mathbb{E} e^{i t \cdot 0}, $$ that is $X_n \to 0$ in distribution.

So we may take, for example, $\lambda_n = n$ and $\alpha_n = 2 + (-1)^n$ for our counterexample.
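This limit is easy to check numerically. The sketch below (an illustration, not part of the original argument) evaluates the gamma characteristic function $(1-it/\lambda_n)^{-\alpha_n}$ directly for $\lambda_n = n$, $\alpha_n = 2+(-1)^n$ and watches it approach $1$:

```python
# Numerical sketch of the counterexample: lambda_n = n, alpha_n = 2 + (-1)^n.
# The characteristic function of Gamma(rate=lam, shape=alpha) at t is
# (1 - it/lam)^(-alpha); it should tend to 1 = E e^{it*0}.
def gamma_cf(t, alpha, lam):
    return (1 - 1j * t / lam) ** (-alpha)

t = 1.0
values = [gamma_cf(t, 2 + (-1) ** n, n) for n in (10, 100, 1000, 10000)]
for v in values:
    print(abs(v - 1))  # decreasing toward 0
```

Even though $\alpha_n$ oscillates between $1$ and $3$, the distance to $1$ shrinks steadily.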

If we suppose that $\alpha_n$ and $\lambda_n$ are bounded away from zero and infinity, that is $0 < c \le \alpha_n, \lambda_n \le C < \infty$, then convergence of $X_n$ is equivalent to pointwise convergence of the functions $\Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n}$ for each $t \in \mathbb{R}$. In this case we may show that "there is convergence in distribution" iff "$\alpha_n$ and $\lambda_n$ are convergent" (and in this case $\lim_n X_n \sim \Gamma(\lim_n \lambda_n, \lim_n \alpha_n)$). The "if" part follows immediately from the convergence of characteristic functions.

To prove the second part, let us take a convergent subsequence $\alpha_{n_k}$. We know that $X_n$ has a limit, hence $\exists \lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n}$. Put $t=1$. The sequences $\Bigl(1 - \frac{i}{\lambda_{n_k}} \Bigr)^{-\alpha_{n_k}}$ and $\alpha_{n_k}$ are convergent; since $\lambda_{n_k} \in [c, C]$ and $\lambda \mapsto \bigl|1 - \frac{i}{\lambda}\bigr|$ is strictly monotone on $(0,\infty)$, the sequence $\lambda_{n_k}$ has a unique limit point. Hence, for some finite $\lambda, \alpha>0$ we have $\lambda_{n_k} \to \lambda$, $\alpha_{n_k} \to \alpha$ and therefore $$\lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = \Bigl(1 - \frac{it}{\lambda} \Bigr)^{-\alpha}.$$

If there is another convergent subsequence $\alpha_{n_k^*} \to \alpha^*$, we have (as above): $\lambda_{n_k^*} \to \lambda^*$ and $$\lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = \Bigl(1 - \frac{it}{\lambda^*} \Bigr)^{-\alpha^*}.$$ Moreover, $\alpha^*, \lambda^* > 0$. We need to prove that $\alpha = \alpha^*$ and $\lambda = \lambda^*$.

It's easy to see that

$$\Bigl(1 - \frac{it}{\lambda} \Bigr)^{\alpha} = \Bigl(1 - \frac{it}{\lambda^*} \Bigr)^{\alpha^*}$$ for all $t \in \mathbb{R}$. Put $C = \frac{\alpha}{\alpha^*} > 0$. We have $\Bigl(1 - \frac{it}{\lambda} \Bigr)^{C} = 1 - \frac{it}{\lambda^*}$. Put $u(t) = 1 - \frac{it}{\lambda}$; since $it = \lambda(1-u)$, the right-hand side equals $1 - \frac{\lambda}{\lambda^*}(1-u)$, so $u^C$ is an affine function of $u$ along the line $\{1 - is : s \in \mathbb{R}\}$. By analyticity this forces $C=1$, hence $\alpha = \alpha^*$.

Moreover, with $C=1$ the identity becomes $1 - \frac{it}{\lambda} = 1 - \frac{it}{\lambda^*}$, so $\lambda = \lambda^*$. The proof is finished.

For the case when $\alpha_n$ or $\lambda_n$ is not bounded away from zero, we have the following theorem.

Theorem 1.

If $\alpha_n \ge c > 0$ and $\lambda_n \to 0$ then $X_n$ diverges.

If $\alpha_n \to 0$ and $\lambda_n \ge c > 0$ then $X_n \overset{w}{\to} 0$.

If $\alpha_n \to 0$ and $\lambda_n \to 0$ then

$$\Bigl(X_n \overset{w}{\to}\Bigr) \Longleftrightarrow \Bigl(X_n \overset{w}{\to} 0\Bigr) \Longleftrightarrow \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0.$$

Let us prove the theorem. If $\xi \sim \Gamma( \lambda, \alpha)$ then, since $\arg\bigl(1 - \frac{it}{\lambda}\bigr) = -\arctan \frac{t}{\lambda}$, \begin{gather} \Bigl( 1 - \frac{it}{\lambda} \Bigr)^{-\alpha} = e^{-\alpha \ln(1 - \frac{it}{\lambda} )} = e^{-\alpha \Bigl( \ln \sqrt{1 + \frac{t^2}{\lambda^2}} - i \arctan \frac{t}{\lambda} \Bigr) } = \nonumber \\ = e^{ - \frac{\alpha}2 \ln \bigl(1 + \frac{t^2}{\lambda^2} \bigr)} \cdot e^{i \alpha \arctan \frac{t}{\lambda} } = {\Bigl( 1 + \frac{t^2}{\lambda^2}\Bigr)}^{-\frac{\alpha}2} \cdot e^{i \alpha \arctan \frac{t}{\lambda} }.\nonumber \end{gather} Hence $$Ee^{itX_n} = {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2} \cdot e^{i \alpha_n \arctan \frac{t}{\lambda_n} }.$$
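The polar decomposition above is easy to sanity-check numerically; this is a sketch comparing the two expressions at a few points (the parameter values $\alpha=0.7$, $\lambda=2.5$ are arbitrary):

```python
import cmath
import math

def cf_direct(t, alpha, lam):
    # Principal-branch power, as in (1 - it/lam)^(-alpha)
    return (1 - 1j * t / lam) ** (-alpha)

def cf_polar(t, alpha, lam):
    # (1 + t^2/lam^2)^(-alpha/2) * exp(i * alpha * arctan(t/lam))
    modulus = (1 + t * t / lam ** 2) ** (-alpha / 2)
    return modulus * cmath.exp(1j * alpha * math.atan(t / lam))

for t in (-3.0, -0.5, 0.0, 1.0, 7.0):
    assert abs(cf_direct(t, 0.7, 2.5) - cf_polar(t, 0.7, 2.5)) < 1e-12
print("identity verified")
```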

From Lévy's continuity theorem we have $$ \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n Ee^{itX_n} = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr).$$

If $\alpha_n \ge c > 0$ and $\lambda_n \to 0$ then $|Ee^{itX_n}| = {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2} \to I_{t=0}$, where $I$ is an indicator function. If $X_n \overset{w}{\to} X$ then $\lim_n |Ee^{itX_n}| = |Ee^{itX}|$ is continuous (as it is the modulus of a characteristic function). But $I_{t=0}$ is not continuous. Hence $X_n$ diverges. The first statement of the theorem is proved.

Here and everywhere below we will assume that $\alpha_n \to 0$. Thus $\alpha_n \arctan \frac{t}{\lambda_n} = \text{bounded} \cdot \alpha_n = \bar{o}(1)$, $n \to \infty$. Therefore $e^{i \alpha_n \arctan \frac{t}{\lambda_n} } = 1 + \bar{o}(1).$

Put $\phi_n(t) ={\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2}$. We have proved that \begin{gather} \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n \phi_n(t) = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr). \tag{1} \end{gather}

If $\alpha_n \to 0$ and $\lambda_n \ge c > 0$ then $\phi_n(t) \to 1$ and moreover $Ee^{itX_n} \to 1$, hence $X_n \overset{w}{\to} 0$. The second statement of the theorem is proved.

Here and everywhere below we will assume that $\lambda_n \to 0$. For $t \ne 0$ we have

\begin{gather} \phi_n(t) = e^{-\frac{\alpha_n}2 \ln {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)} } = e^{-\frac{\alpha_n}2 (1 + \bar{o}(1))\ln {\Bigl(\frac{t^2}{\lambda_n^2}\Bigr)} } = \nonumber \\ = e^{-\alpha_n (1 + \bar{o}(1)) ( \ln |t| - \ln \lambda_n )} = e^{\alpha_n \ln \lambda_n (1 + \bar{o}(1))}, \nonumber \end{gather} where in the last step we used $\alpha_n \ln |t| \to 0$ for fixed $t \ne 0$.

Hence for $t \ne 0$ we have $$ \exists \lim_n \phi_n(t) \Longleftrightarrow \exists \lim_n \alpha_n \ln \lambda_n = A \in [-\infty, 0]$$ and if this condition holds true then $\phi(t) = \lim_n \phi_n(t) = e^{A}$, $t \ne 0$. Obviously $\lim_n \phi_n(0) = \lim_n 1 = 1$.

So \begin{gather} \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n \phi_n(t) = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr) \Longleftrightarrow \nonumber \\ \exists \lim_n (\alpha_n \ln \lambda_n) = A \text{ and } A=0. \tag{2} \end{gather} From (1) and (2) we get that $X_n$ converges iff $ \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0$. Moreover, if $ \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0$ then $Ee^{itX_n} = \phi_n(t) \cdot e^{i \alpha_n \arctan \frac{t}{\lambda_n} } \to 1 = Ee^{it\cdot 0}$, hence $X_n \overset{w}{\to} 0$.

Theorem is proved.
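The third case of the theorem can be illustrated numerically. The sketch below uses two illustrative sequences (my own choices, not from the text): one with $\alpha_n \ln \lambda_n \to 0$, one with $\alpha_n \ln \lambda_n \equiv -1$.

```python
import math

# Sketch of the Theorem 1 criterion when alpha_n -> 0 and lambda_n -> 0:
# phi_n(t) = (1 + t^2/lambda_n^2)^(-alpha_n/2) should tend to e^A for t != 0,
# where A = lim alpha_n * ln(lambda_n).
def phi(t, alpha, lam):
    return (1 + t * t / lam ** 2) ** (-alpha / 2)

t = 1.0
# alpha_n = 1/n^2, lambda_n = 1/n: alpha_n*ln(lambda_n) -> 0, so X_n -> 0.
conv = [phi(t, 1 / n ** 2, 1 / n) for n in (10, 100, 1000)]
# alpha_n = 1/n, lambda_n = e^{-n}: alpha_n*ln(lambda_n) = -1, so X_n diverges:
# phi_n(t) -> e^{-1} for t != 0 while phi_n(0) = 1, a discontinuous limit.
div = [phi(t, 1 / n, math.exp(-n)) for n in (10, 100, 300)]
print(conv, div)
```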


This answer is the second part of my previous one; I write it separately for convenience of typing.

What have we proved already? In the cases when $\alpha_n \to 0$ or $\lambda_n \to 0$ (or both), the answer is given by the previous theorem. Suppose now that $\alpha_n$ and $\lambda_n$ are bounded away from zero. The case when both of them are also bounded from above was considered as well --- in this case we proved that $ \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \exists \lim_n \lambda_n$ and $\exists \lim_n \alpha_n$. Now we will deal with the last case, when $\alpha_n$ and $\lambda_n$ are bounded away from zero and at least one of them is unbounded.

Theorem 2.

Suppose that $\alpha_n\ge c_1 > 0$ and $\lambda_n \ge c_1 > 0$.

If $\alpha_n \le c_2$ and $\lambda_n \to \infty$ then $X_n \overset{w}{\to} 0$.

If $\alpha_n \to \infty$ and $\lambda_n \le c_2$ then $X_n$ diverges.

If $\alpha_n \to \infty$ and $\lambda_n \to \infty$ then $$\Bigl(X_n \overset{w}{\to}\Bigr) \Longleftrightarrow \exists \lim_n \frac{\alpha_n}{\lambda_n},$$ and if $X_n$ is convergent then $X_n \overset{w}{\to} \lim_n \frac{\alpha_n}{\lambda_n}$.

Let us prove Theorem 2.

The first and the second statements may be proved similarly to the proof of Theorem 1.

Consider the most interesting case: $\alpha_n \to \infty$ and $\lambda_n \to \infty$. Here and everywhere below we will assume that this condition holds.

As above, we have $$Ee^{itX_n} = {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2} \cdot e^{i \alpha_n \arctan \frac{t}{\lambda_n} } = e^{- \frac{\alpha_n}2 \cdot \frac{t^2}{\lambda_n^2}(1+\bar{o}(1))} \cdot e^{i t \frac{\alpha_n}{\lambda_n} (1+\bar{o}(1))}. $$

Suppose that $X_n$ is convergent. Hence $|Ee^{itX_n}|$ converges to a continuous function. But $|Ee^{itX_n}| = e^{- \frac{\alpha_n}2 \cdot \frac{t^2}{\lambda_n^2}(1+\bar{o}(1))} $, thus $ \frac{\alpha_n}{\lambda_n^2} \to A \in [0, \infty)$. Moreover, $Ee^{itX_n}$ converges to a continuous function. Hence $e^{i t \frac{\alpha_n}{\lambda_n} (1+\bar{o}(1))}$ converges to a continuous function. It means that the constants $\xi_n = \frac{\alpha_n}{\lambda_n}$, viewed as degenerate random variables, converge weakly to some $\xi$. Put $F_{\xi_n}(x) = P(\xi_n \le x)$, $F_{\xi}(x) = P(\xi \le x)$. We know that $F_{\xi_n}(x) \to F_{\xi}(x) $ for all $x$ outside an at most countable set. Hence, $F_{\xi}(x)$ is the distribution function of some constant. It's easy to see that $\exists \lim_n \frac{\alpha_n}{\lambda_n} < \infty$ and this limit is equal to $\xi$. So we proved that

$$ \Bigl( X_n \overset{w}{\to} \Bigr) \Longrightarrow \Bigl( \exists \lim_n \frac{\alpha_n}{\lambda_n} \Bigr). $$

Now suppose that $\exists \lim_n \frac{\alpha_n}{\lambda_n} = B < \infty$. Thus $\exists \lim_n \frac{\alpha_n}{\lambda_n^2} = 0$. We know that $Ee^{itX_n} = e^{- \frac{\alpha_n}2 \cdot \frac{t^2}{\lambda_n^2}(1+\bar{o}(1))} \cdot e^{i t \frac{\alpha_n}{\lambda_n} (1+\bar{o}(1))}$ hence $Ee^{itX_n} = e^{i t B (1+\bar{o}(1))}$.

Thus $$\Bigl(\exists \lim_n \frac{\alpha_n}{\lambda_n} = B \Bigr) \Longrightarrow \Bigl(X_n \overset{w}{\to} B\Bigr),$$ q.e.d.
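The forward implication can be checked numerically; here is a sketch for the illustrative choice $\alpha_n = 2n$, $\lambda_n = n$ (so $B = 2$):

```python
import cmath

# Sketch of Theorem 2, third case: alpha_n = 2n, lambda_n = n, so
# alpha_n/lambda_n -> B = 2, and E e^{itX_n} = (1 - it/lambda_n)^(-alpha_n)
# should tend to e^{itB}, the characteristic function of the constant B.
def gamma_cf(t, alpha, lam):
    return (1 - 1j * t / lam) ** (-alpha)

t, B = 1.0, 2.0
errs = [abs(gamma_cf(t, 2 * n, n) - cmath.exp(1j * t * B)) for n in (10, 100, 10000)]
print(errs)  # decreasing toward 0
```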

Finally, let us present an informal proof of the main part of Theorem 2: the case $\alpha_n \to \infty$. It's well known that $\Gamma(\lambda, m)$ with integer $m$ is a sum of $m$ independent exponential random variables $\mathrm{Exp}(\lambda)$ with parameter $\lambda$. The following symbolic equality is fulfilled: $\mathrm{Exp}(\lambda) \overset{d}{=} \frac{\mathrm{Exp}(1)}{\lambda}$; it means that $\lambda$ is a scale parameter. Hence in the case $\alpha_n \to \infty$ we have \begin{gather} X_n \overset{d}{=} \Gamma(\lambda_n, \alpha_n) \approx \sum_{i=1}^{[\alpha_n]} \mathrm{Exp}(\lambda_n) \overset{d}{=} \sum_{i=1}^{[\alpha_n]} \frac{\mathrm{Exp}(1)}{\lambda_n} = \nonumber \\ =\frac{\sum_{i=1}^{[\alpha_n]} \mathrm{Exp}(1) }{[\alpha_n]} \cdot \frac{[\alpha_n]}{\lambda_n} \overset{\text{S.L.L.N.}}{=} (1+\bar{o}(1)) \cdot \frac{[\alpha_n]}{\lambda_n} \approx \frac{\alpha_n}{\lambda_n}. \nonumber \end{gather} Now it's easy to see that if $\lambda_n \le c_2$ then $X_n$ diverges and if $\lambda_n \to \infty$ then $\Bigl(X_n \overset{w}{\to}\Bigr)$ iff $\exists \lim_n \frac{\alpha_n}{\lambda_n}$. Moreover, if $X_n$ is convergent then $X_n \overset{w}{\to} \lim_n \frac{\alpha_n}{\lambda_n}$.
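The S.L.L.N. heuristic can be illustrated by simulation with the standard library. Note that `random.gammavariate(shape, scale)` uses the scale parametrization, so the scale is $1/\lambda_n$; the parameter values below are an illustrative choice.

```python
import random

# Monte Carlo sketch of the S.L.L.N. heuristic: for large alpha_n, the law
# Gamma(lambda_n, alpha_n) concentrates near alpha_n/lambda_n.
random.seed(0)
alpha_n, lam_n = 10000.0, 5000.0   # alpha_n/lambda_n = 2
samples = [random.gammavariate(alpha_n, 1 / lam_n) for _ in range(200)]
mean = sum(samples) / len(samples)
print(mean)  # close to 2; one sample's standard deviation is sqrt(alpha_n)/lambda_n = 0.02
```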

Now it is easy to construct "bad" sequences $\alpha_n$ and $\lambda_n$ (having no limits at all) such that $X_n$ nevertheless converges. For example, put

\begin{gather} \alpha_{3n} = \frac1{n}, \lambda_{3n} = 2+(-1)^n,\\ \alpha_{3n+1} = 2+(-1)^n, \lambda_{3n+1} = n,\\ \alpha_{3n+2} = \frac1{n}, \lambda_{3n+2} = \frac1{n}. \end{gather}

It follows from Theorem 1 and Theorem 2 that $X_n \overset{d}{\to} 0$.
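A quick numerical sketch supports this: for each of the three branches of the "bad" sequence, the characteristic function $(1-it/\lambda)^{-\alpha}$ tends to $1$.

```python
# Characteristic-function check of the "bad" sequences above: the branches are
# (alpha, lambda) = (1/n, 2+(-1)^n), (2+(-1)^n, n) and (1/n, 1/n).
def gamma_cf(t, alpha, lam):
    return (1 - 1j * t / lam) ** (-alpha)

t = 1.0
errs = []
for n in (100, 10000):
    for alpha, lam in [(1 / n, 2 + (-1) ** n), (2 + (-1) ** n, n), (1 / n, 1 / n)]:
        errs.append(abs(gamma_cf(t, alpha, lam) - 1))
print(errs)  # all three branches shrink toward 0 as n grows
```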


Let $\{\mu_n\overset{\text{d}}{=}\Gamma(\alpha_n,\lambda_n), n\ge 1\}$ be a sequence of Gamma-distributions.
The necessary and sufficient conditions for weak convergence of a Gamma-distribution can be stated as follows:

  1. The necessary and sufficient condition for $ \mu_n\Rightarrow\delta_0 $ (where $\delta_0$ is the probability distribution concentrated at the single point $0$) is \begin{equation*} \lim_{n\to\infty}\alpha_n\log\Big(1+\frac1{\lambda_n}\Big)= 0. \tag{1} \end{equation*}

  2. The necessary and sufficient condition for $ \mu_n\Rightarrow\delta_c$ (where $\delta_c$ is the probability distribution concentrated at the single point $c>0$) is

\begin{equation*} \lim_{n\to\infty}\lambda_n=\infty, \qquad \lim_{n\to\infty}\frac{\alpha_n}{\lambda_n}= c. \tag{2} \end{equation*}

  3. The necessary and sufficient condition for $ \mu_n\Rightarrow\mu\ne\delta_c\ (c\ge 0)$ is \begin{equation*} \lim_{n\to\infty}\alpha_n=\alpha\in(0,\infty),\qquad \lim_{n\to\infty}\lambda_n=\lambda\in(0,\infty). \tag{3} \end{equation*}

If $\mu$ is a probability distribution on $\mathbb{R}_+=[0,\infty)$, then define the Laplace transform of $ \mu $ as follows: $$ L(u)=\int_0^\infty e^{-ux}\mu(dx),\qquad u>0, $$ and $$ \psi(u) =-\log L(u). $$ (For the Laplace transform of distributions please refer to W. Feller, An Introduction to Probability Theory and Its Applications (2nd Ed.), Vol. II, John Wiley & Sons, Inc. (1971), Ch. 13.)

If $\mu=\Gamma(\alpha,\lambda)$ is a $\Gamma$-distribution, then it is an infinitely divisible distribution on $\mathbb{R}_+$, and its Laplace transform is $$ L(u)=\Big(1+\frac{u}{\lambda}\Big)^{-\alpha}, $$ and \begin{gather*} \psi(u)=-\log L(u)=\alpha\log\Big(1+\frac{u}{\lambda}\Big), \tag{4}\\ \alpha=\frac{\psi(u)}{\log\Big(1+\dfrac{u}{\lambda}\Big)},\qquad \lambda=\frac{u}{e^{\psi(u)/\alpha}-1}. \tag{5} \end{gather*} In particular, $$ \psi_{\delta_0}(u)=0,\qquad \psi_{\delta_c}(u)=cu. $$

Furthermore, $\mu_n\overset{w}{\to}\mu^\ast$ iff (cf. Feller's book) \begin{gather*} L_n(u)\to L^\ast(u),\qquad \forall u\ge0,\\ \begin{aligned} \psi_n(u)&=-\log L_n(u)=\alpha_n\log\Big(1+\frac{u}{\lambda_n}\Big)\\ &\to\psi^\ast(u)=-\log L^\ast(u),\qquad \forall u\ge0. \end{aligned} \tag{6} \end{gather*} Using a method similar to that for characteristic functions we can also deduce that the convergence in (6) is uniform at $ u=0 $, that is, \begin{equation*} \lim_{n\to\infty}\psi_n(u_n)=0,\qquad \forall u_n\to0. \tag{7} \end{equation*}

Firstly, to prove the sufficiency of (1), it suffices to use the following fact: \begin{align*} 0<c_1(u)&\le \inf_{\lambda>0}\frac{\log\Big(1+\dfrac{u}{\lambda}\Big)}{\log\Big(1+\dfrac1{\lambda}\Big)}\\ &\le \sup_{\lambda>0}\frac{\log\Big(1+\dfrac{u}{\lambda}\Big)}{\log\Big(1+\dfrac1{\lambda}\Big)} \le C_2(u)<\infty,\qquad \forall u>0. \end{align*}
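The inversion formulas (5) are easy to verify numerically; this is a sketch with arbitrary parameter values:

```python
import math

# Check the inversion formulas (5): from psi(u) = alpha*log(1 + u/lam)
# one recovers alpha and lam at any fixed u > 0.
alpha, lam, u = 0.8, 3.0, 2.0
psi = alpha * math.log(1 + u / lam)
alpha_rec = psi / math.log(1 + u / lam)   # first formula in (5)
lam_rec = u / math.expm1(psi / alpha)     # second formula in (5): u/(e^{psi/alpha}-1)
print(alpha_rec, lam_rec)
```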

The proofs of sufficiency of (2) and (3) are direct.

Next, only the necessity is discussed. In the following we always assume $\mu_n\overset{w}{\to}\mu^\ast$, so that (6) holds.

A. If $ \psi^\ast(u)\ne0 $, then \begin{equation*} \varliminf_{n\to\infty} \alpha_n>0,\qquad \varliminf_{n\to\infty} \lambda_n>0. \tag{8} \end{equation*}

\textit{Proof}: We prove (8) by contradiction. If (8) is not true, then there exists a subsequence with $ \alpha_{n_k}\to 0 $ (or $\lambda_{n_k}\to 0$). From (5) and (6), \begin{gather*} \lambda_{n_k}=\frac{u}{e^{\psi_{n_k}(u)/\alpha_{n_k}}-1}\to0\\ \Big(\text{or } \alpha_{n_k}=\frac{\psi_{n_k}(u)}{\log\Big(1+\dfrac{u}{\lambda_{n_k}}\Big)}\to0\Big). \end{gather*} Hence both $\alpha_{n_k}\to 0$ and $\lambda_{n_k}\to 0$. Pick $a$ with $\psi^\ast(a)\ne 0$; then \begin{equation*} \lim_{k\to\infty} \alpha_{n_k}\log\Big(1+\frac{a}{\lambda_{n_k}}\Big) =\lim_{k\to\infty} \psi_{n_k}(a)=\psi^{\ast}(a)\ne0. \tag{9} \end{equation*}
Now take $ u_{n_k}=\sqrt{\lambda_{n_k}^2+a\lambda_{n_k}}-\lambda_{n_k} $, then $ u_{n_k}\to0 $ and \begin{align*} \psi_{n_k}(u_{n_k})&=\alpha_{n_k}\log\Big(1+\frac{u_{n_k}}{\lambda_{n_k}}\Big)\\ &=\frac12\alpha_{n_k}\log\Big(1+\frac{a}{\lambda_{n_k}}\Big) \to \frac12\psi^\ast(a)\ne0 \tag{10} \end{align*}
(10) contradicts (7). Hence (8) holds.
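The choice of $u_{n_k}$ in (10) rests on a small algebraic identity, checked numerically below (the value $a=2$ is arbitrary):

```python
import math

# Algebra behind (10): with u = sqrt(lam^2 + a*lam) - lam we get
# (1 + u/lam)^2 = 1 + a/lam, hence log(1 + u/lam) = (1/2)*log(1 + a/lam),
# and u <= sqrt(a*lam) -> 0 as lam -> 0.
a = 2.0
for lam in (1e-2, 1e-4, 1e-6):
    u = math.sqrt(lam ** 2 + a * lam) - lam
    assert abs(math.log(1 + u / lam) - 0.5 * math.log(1 + a / lam)) < 1e-9
    assert u < math.sqrt(a * lam)
print("identity verified")
```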

B. If $ \psi^\ast(u)\ne0 $ and there exists a subsequence $\alpha_{n_k}\to\infty$ (or $ \lambda_{n_k}\to\infty $), then \begin{gather*} \lim_{n\to\infty}\alpha_n=\lim_{n\to\infty}\lambda_n=\infty,\\ \lim_{n\to\infty}\frac{\alpha_n}{\lambda_n}=c, \qquad \psi^\ast(u)=cu. \end{gather*}

\textit{Proof}: From (5) and (6), if $\alpha_{n_k}\to\infty$ (or $ \lambda_{n_k}\to\infty $), then \begin{equation*} \lambda_{n_k}=\frac{u}{e^{\psi_{n_k}(u)/\alpha_{n_k}}-1}\to\infty\quad \Big(\text{or } \alpha_{n_k}=\frac{\psi_{n_k}(u)}{\log\Big(1+\dfrac{u}{\lambda_{n_k}}\Big)} \to\infty\Big) \end{equation*} and \begin{align*} \lim_{k\to\infty}\frac{\alpha_{n_k}}{\lambda_{n_k}} &=\lim_{k\to\infty}\frac{\psi_{n_k}(u)}{\lambda_{n_k}\log\Big(1+\dfrac{u}{\lambda_{n_k}}\Big)}\\ &=\frac{\psi^\ast(u)}{u}\ne0, \end{align*} $$ \psi^\ast(u)=cu.$$ In this case, if there existed a subsequence $ \{\alpha_{n_k},k\ge 1\} $ (or $\{\lambda_{n_k},k\ge 1\}$) converging to a finite limit, we would have $\psi_{n_k}(u)\to\psi^{\ast\ast}(u)\ne cu=\psi^\ast(u) $; hence no subsequence of $ \{\alpha_{n},n\ge 1\} $ (nor of $\{\lambda_{n},n\ge 1\} $) converges to a finite limit. B holds.

C. If $ \psi^\ast(u)\ne cu$ for every $c\ge 0$, then \begin{gather*} \lim_{n\to\infty}\alpha_n=\alpha^\ast\in(0,\infty),\\ \lim_{n\to\infty}\lambda_n=\lambda^\ast\in(0,\infty),\\ \psi^\ast(u)=\alpha^\ast\log\Big(1+\frac{u}{\lambda^\ast}\Big). \end{gather*}

\textit{Proof}: From {\bfseries A, B}, if $ \psi^\ast(u)\ne cu$ for every $c\ge 0$, then \begin{gather*} 0<\varliminf_{n\to\infty} \alpha_n\le \varlimsup_{n\to\infty} \alpha_n<\infty,\\ 0<\varliminf_{n\to\infty} \lambda_n\le \varlimsup_{n\to\infty} \lambda_n<\infty. \end{gather*} This means that $S=\{(\alpha_n,\lambda_n), n\ge 1\} $ is a relatively compact set in $(0,+\infty)\times(0,+\infty)$ (the closure $\overline{S}\subset(0,+\infty)\times(0,+\infty)$ and $\overline{S}$ is compact). Now we prove that $ S $ has a unique limit point. Suppose that $\{(\alpha_{n_k},\lambda_{n_k}), k\ge 1\} $ is a convergent subsequence in $ S $ with \begin{align*} \alpha^\ast=\lim_{k\to\infty}\alpha_{n_k},\qquad \lambda^\ast=\lim_{k\to\infty}\lambda_{n_k}; \end{align*} then $\mu^*\overset{\text{d}}{=}\lim\limits_{k\to\infty}\mu_{n_k}$ is a $ \Gamma(\alpha^\ast, \lambda^\ast) $-distribution and \begin{equation*} \alpha^\ast=\frac{(\mathsf{E}_{\mu^\ast}[X])^2}{\mathsf{V}_{\mu^\ast}[X]},\quad \lambda^\ast=\frac{\mathsf{E}_{\mu^\ast}[X]}{\mathsf{V}_{\mu^\ast}[X]}. \end{equation*} Since these moments are determined by $\mu^\ast$ alone, $ S $ has a unique limit point $(\alpha^\ast,\lambda^\ast)$ and $ \mu^\ast\overset{\text{d}}{=}\Gamma(\alpha^\ast,\lambda^\ast) $. C holds.
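The moment identities used above are elementary; here is a quick check with arbitrary parameter values:

```python
# For Gamma(alpha, lam): E X = alpha/lam and Var X = alpha/lam^2,
# so alpha = (E X)^2 / Var X and lam = E X / Var X.
alpha, lam = 2.5, 4.0
mean, var = alpha / lam, alpha / lam ** 2
print(mean ** 2 / var, mean / var)  # recovers (alpha, lam) = (2.5, 4.0)
```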

From the above facts, it is easy to get the necessary conditions for weak convergence of Gamma-distributions.


In this comment I will give a full answer to the question about convergence of $X_n$.

We will say that the sequence $(\alpha_n, \lambda_n)$ is "nice" if it has the following property: whenever $\alpha_{n_k} \to c \in [0, \infty]$ along a subsequence, then:

  1. if $c=0$ then $\liminf_{k} \lambda_{n_k}^{\alpha_{n_k}} \ge 1$
  2. if $c>0$ then $\lim_{k} \frac{\alpha_{n_k}}{\lambda_{n_k}}=0$.

Theorem 3: If $X_n \overset{d}{\to} X$ then $X$ is either a constant $\ge 0$ or $X \sim \Gamma(\lambda, \alpha)$.

$$\Bigl( X_n \overset{d}{\to} 0 \Bigr) \Longleftrightarrow \Bigl( (\alpha_n, \lambda_n) \text{ is "nice"} \Bigr).$$

$$\Bigl(X_n \overset{d}{\to} X = C = const>0 \Bigr) \Longleftrightarrow \Bigl( \alpha_n \to \infty \text{ and } \frac{\alpha_n}{\lambda_n} \to C \in (0, \infty) \Bigr)$$

\begin{gather} \Bigl(X_n \overset{d}{\to} X \sim \Gamma(\lambda, \alpha) \Bigr) \Longleftrightarrow \Bigl( \lim_n \alpha_n = \alpha \in (0, \infty) \text{ and } \lim_n \lambda_n = \lambda \in (0, \infty) \Bigr)\nonumber \end{gather}

Corollary 1.

$X_n$ is convergent iff one of the following conditions holds true:

  1. $(\alpha_n, \lambda_n)$ is "nice"
  2. $\Bigl( \alpha_n \to \infty \text{ and } \frac{\alpha_n}{\lambda_n} \to C \in (0, \infty) \Bigr)$
  3. $\lim_n \alpha_n = \alpha \in (0, \infty) \text{ and } \lim_n \lambda_n = \lambda \in (0, \infty)$.

Proof.

First, let us present the results of my two previous answers in a concise way.

If $\alpha_n \ge c > 0$ and $\lambda_n \to 0$ then $X_n$ diverges.

If $\alpha_n \to 0$ and $\lambda_n \ge c > 0$ then $X_n \overset{w}{\to} 0$.

If $\alpha_n \to 0$ and $\lambda_n \to 0$ then

$$\Bigl(X_n \overset{w}{\to}\Bigr) \Longleftrightarrow \Bigl(X_n \overset{w}{\to} 0\Bigr) \Longleftrightarrow \Bigl(\lim_{n \to \infty} \alpha_n \ln \lambda_n = 0\Bigr)$$

If for some $0<c_1 \le c_2 < \infty$ we have $\alpha_n, \lambda_n \in [c_1, c_2]$ then

$ \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \Bigl(\exists \lim_n \lambda_n$ and $\exists \lim_n \alpha_n \Bigr)$. In this case $X_n \overset{w}{\to} \Gamma(\lim_n \lambda_n, \lim_n \alpha_n)$.

If $\alpha_n \le c_2$ and $\lambda_n \to \infty$ then $X_n \overset{w}{\to} 0$.

If $\alpha_n \to \infty$ and $\lambda_n \le c_2$ then $X_n$ diverges.

If $\alpha_n \to \infty$ and $\lambda_n \to \infty$ then $$\Bigl(X_n \overset{w}{\to}\Bigr) \Longleftrightarrow \exists \lim_n \frac{\alpha_n}{\lambda_n},$$ and if $X_n$ is convergent then $X_n \overset{w}{\to} \lim_n \frac{\alpha_n}{\lambda_n}$.

It's easy to see that if $X_n \overset{d}{\to} X$ then $X$ is either a constant $\ge 0$ or $X \sim \Gamma(\lambda, \alpha)$.

Moreover, it follows immediately that $$\Bigl(X_n \overset{d}{\to} X = C = const>0 \Bigr) \Longleftrightarrow \Bigl( \alpha_n \to \infty \text{ and } \frac{\alpha_n}{\lambda_n} \to C \Bigr)$$ and \begin{gather} \Bigl(X_n \overset{d}{\to} X \sim \Gamma(\lambda, \alpha) \Bigr) \Longleftrightarrow \Bigl( \lim_n \alpha_n = \alpha \in (0, \infty) \text{ and } \lim_n \lambda_n = \lambda \in (0, \infty) \Bigr)\nonumber \end{gather} These conditions were mentioned here by JGWang.

The most interesting case is case when $X_n \overset{d}{\to} 0$.

We got convergence $X_n \overset{d}{\to} 0$ only in these cases:

  1. $\alpha_n \to 0$ and $\lambda_n \ge c > 0$,

  2. $\alpha_n \to 0$, $\lambda_n \to 0$ and $\lim_{n \to \infty} \alpha_n \ln \lambda_n = 0$,

  3. $\alpha_n \le c_2$, $\lambda_n \to \infty$,

  4. $\alpha_n \to \infty$, $\lambda_n \to \infty$ and $\lim_n \frac{\alpha_n}{\lambda_n}=0$.

These cases can be summarized as follows:

  1. $\alpha_n \to 0$ and $\liminf_n \lambda_n^{\alpha_n} \ge 1$ (equivalently, $\limsup_n (-\alpha_n \ln \lambda_n) \le 0$)
  2. $0 < c_1 \le \alpha_n \le c_2 < \infty$ and $\lambda_n \to \infty$
  3. $\alpha_n \to \infty$ and $\lim_n \frac{\alpha_n}{\lambda_n}=0$.

Hence $X_n \overset{d}{\to} 0$ iff $(\alpha_n, \lambda_n)$ is "nice", q.e.d.