Let $(X_n)_n$ be a sequence of random variables with densities $f_{X_n}(x)=\frac{1}{\Gamma(\alpha_n)}\lambda_n^{\alpha_n}x^{\alpha_n-1}e^{-\lambda_n x}1_{]0,+\infty[}(x),$ where $\alpha_n>0$ and $\lambda_n>0.$
Suppose that $\alpha_n=1,\forall n \in \mathbb{N}.$ Find a necessary and sufficient condition on $(\lambda_n)_n$ such that $(X_n)_n$ converges in distribution.
More generally, find a necessary and sufficient condition on $\alpha_n,\lambda_n$ so that $(X_n)_n$ converges in distribution.
The first part is easy: $(X_n)_n$ converges in distribution if and only if $0<\liminf_n\lambda_n=\limsup_n\lambda_n$.
Concerning part 2), is it true that weak convergence of $(X_n)_n$ requires the convergence of $(\alpha_n)$ and $(\lambda_n)$?
"Is it true that weak convergence requires the convergence of $\alpha_n$ and $\lambda_n$?" The answer is negative.
Counterexample:
Suppose $\lambda_n \to +\infty$ and $\alpha_n = o(\lambda_n)$ as $n \to \infty$, and let $X_n$ have the Gamma distribution above, with shape $\alpha_n$ and rate $\lambda_n$. We have
$$\mathbb{E} e^{i t X_n} = \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = e^{-\frac{\alpha_n}{\lambda_n} \cdot \lambda_n \ln\bigl( 1 - \frac{it}{\lambda_n} \bigr)}.$$
Hence,
$$\lim_{n \to \infty} \mathbb{E} e^{i t X_n} = \lim_{n \to \infty} e^{-o(1)\,(-it)(1+o(1))} = 1 = \mathbb{E} e^{i t \cdot 0}, $$ that is, $X_n \to 0$ in distribution (we used $\lambda_n \ln\bigl(1 - \frac{it}{\lambda_n}\bigr) = -it\,(1+o(1))$ and $\frac{\alpha_n}{\lambda_n} = o(1)$).
So we may take, for example, $\lambda_n = n$ and $\alpha_n = 2 + (-1)^n$ for our counterexample.
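This behaviour is easy to check numerically; here is a sketch using SciPy (an assumption on tooling: `scipy.stats.gamma(a, scale)` is shape–scale, so the density above corresponds to `a = alpha_n`, `scale = 1/lambda_n`):

```python
# Numerical check of the counterexample: lambda_n = n, alpha_n = 2 + (-1)^n.
# Although alpha_n oscillates between 1 and 3, X_n -> 0 in distribution,
# i.e. P(X_n <= eps) -> 1 for every fixed eps > 0.
from scipy.stats import gamma

eps = 0.1
for n in [10, 100, 1000, 10000]:
    lam = n
    alpha = 2 + (-1) ** n
    # shape-scale parameterization: scale = 1 / rate
    p = gamma(a=alpha, scale=1 / lam).cdf(eps)
    print(n, alpha, p)
```

The printed probabilities increase to $1$, even though the shape parameter keeps jumping between $1$ and $3$.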
If we suppose that $\alpha_n$ and $\lambda_n$ are bounded away from zero and infinity, that is, $0 < c \le \alpha_n, \lambda_n \le C < \infty$, then convergence in distribution of $X_n$ is equivalent to pointwise convergence of the characteristic functions $\Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n}$ for each $t \in \mathbb{R}$ (the pointwise limit is automatically continuous here, since the derivatives of these characteristic functions are uniformly bounded when $c \le \alpha_n, \lambda_n \le C$). In this case we can show that "there is convergence in distribution" iff "$\alpha_n$ and $\lambda_n$ are convergent" (and then $\lim_n X_n \sim \Gamma(\lim_n \lambda_n, \lim_n \alpha_n)$). The "if" part follows immediately from the convergence of the characteristic functions.
To prove the "only if" part, take a convergent subsequence $\alpha_{n_k} \to \alpha \in [c, C]$; passing to a further subsequence, we may assume that $\lambda_{n_k} \to \lambda \in [c, C]$ as well. Since $X_n$ converges in distribution, $\lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n}$ exists for each $t$, and computing it along this subsequence gives $$\lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = \Bigl(1 - \frac{it}{\lambda} \Bigr)^{-\alpha}.$$
If there is another pair of convergent subsequences $\alpha_{n_k^*} \to \alpha^*$, $\lambda_{n_k^*} \to \lambda^*$ (again with $\alpha^*, \lambda^* \in [c, C]$, so in particular $\alpha^*, \lambda^* > 0$), then in the same way $$\lim_n \Bigl(1 - \frac{it}{\lambda_n} \Bigr)^{-\alpha_n} = \Bigl(1 - \frac{it}{\lambda^*} \Bigr)^{-\alpha^*}.$$ We need to prove that $\alpha = \alpha^*$ and $\lambda = \lambda^*$.
Comparing the two expressions for the limit, we see that
$$\Bigl(1 - \frac{it}{\lambda} \Bigr)^{\alpha} = \Bigl(1 - \frac{it}{\lambda^*} \Bigr)^{\alpha^*}$$ for all $t \in \mathbb{R}$. Put $C_0 = \frac{\alpha}{\alpha^*} > 0$. We have $\Bigl(1 - \frac{it}{\lambda} \Bigr)^{C_0} = 1 - \frac{it}{\lambda^*}$. Put $u(t) = 1 - \frac{it}{\lambda}$: as $t$ runs over $\mathbb{R}$, $u$ runs over the vertical line $1 + i\mathbb{R}$ in $\mathbb{C}$, and $u^{C_0}$ coincides with an affine function of $u$ on this line. Since $z \mapsto z^{C_0}$ is analytic in a neighbourhood of this line, it must then be affine there, which forces $C_0 = 1$, that is, $\alpha = \alpha^*$.
With $C_0 = 1$ the identity becomes $1 - \frac{it}{\lambda} = 1 - \frac{it}{\lambda^*}$, so $\lambda = \lambda^*$. The proof is finished.
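As a small numerical illustration of why convergence of the parameters is needed in the bounded case (a sketch; the characteristic function formula is the one used throughout this answer):

```python
# With parameters bounded away from 0 and infinity, non-convergent parameters
# destroy weak convergence: for alpha_n = 2 + (-1)^n and lambda_n = 1 the
# characteristic function at t = 1 oscillates between two distinct values.
def cf(t, alpha, lam):
    # characteristic function of Gamma(shape alpha, rate lam)
    return (1 - 1j * t / lam) ** (-alpha)

vals = [cf(1.0, 2 + (-1) ** n, 1.0) for n in range(1, 7)]
for v in vals:
    print(v)
```

Two distinct values alternate, so the characteristic functions have no pointwise limit at $t = 1$.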
For the cases when $\alpha_n$ or $\lambda_n$ is not bounded away from zero, we have the following theorem.
Theorem 1.
If $\alpha_n \ge c > 0$ and $\lambda_n \to 0$ then $X_n$ diverges.
If $\alpha_n \to 0$ and $\lambda_n \ge c > 0$ then $X_n \overset{w}{\to} 0$.
If $\alpha_n \to 0$ and $\lambda_n \to 0$ then
$$\bigl( X_n \text{ converges weakly} \bigr) \Longleftrightarrow X_n \overset{w}{\to} 0 \Longleftrightarrow \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0.$$
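Before the proof, a numerical sketch of the third case (the concrete sequences $\lambda_n = 1/n$ with $\alpha_n = 1/n$ or $\alpha_n = 1/\ln n$ are my illustrative choices, not part of the theorem; SciPy's shape–scale parameterization is assumed):

```python
# Third case of Theorem 1, with lambda_n = 1/n:
#  - alpha_n = 1/n:      alpha_n * ln(lambda_n) = -ln(n)/n -> 0, so X_n -> 0;
#  - alpha_n = 1/ln(n):  alpha_n * ln(lambda_n) = -1 != 0, so X_n diverges
#    (a mass of about e^{-1} stays near 0 and the rest escapes to infinity).
import math
from scipy.stats import gamma

n = 10 ** 6
lam = 1.0 / n
p1 = gamma(a=1.0 / n, scale=1.0 / lam).cdf(1.0)
p2 = gamma(a=1.0 / math.log(n), scale=1.0 / lam).cdf(1.0)
print(p1)  # close to 1
print(p2)  # roughly exp(-1), approached slowly
```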
Let us prove the theorem. If $\xi \sim \Gamma( \lambda, \alpha)$ then \begin{gather} \Bigl( 1 - \frac{it}{\lambda} \Bigr)^{-\alpha} = e^{-\alpha \ln(1 - \frac{it}{\lambda} )} = e^{-\alpha \Bigl( \ln \sqrt{1 + \frac{t^2}{\lambda^2}} + i \arg\bigl(1 - \frac{it}{\lambda}\bigr) \Bigr) } = \nonumber \\ = e^{ - \frac{\alpha}2 \ln \bigl(1 + \frac{t^2}{\lambda^2} \bigr)} \cdot e^{i \alpha \arctan \frac{t}{\lambda} } = {\Bigl( 1 + \frac{t^2}{\lambda^2}\Bigr)}^{-\frac{\alpha}2} \cdot e^{i \alpha \arctan \frac{t}{\lambda} },\nonumber \end{gather} where we used $\arg\bigl(1 - \frac{it}{\lambda}\bigr) = -\arctan \frac{t}{\lambda}$. Hence $$Ee^{itX_n} = {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2} \cdot e^{i \alpha_n \arctan \frac{t}{\lambda_n} }.$$
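The polar form can be cross-checked numerically against the direct complex power (Python's `**` uses the principal branch, and here $\operatorname{Re}\bigl(1 - \frac{it}{\lambda}\bigr) = 1 > 0$, so the branches agree):

```python
# Sanity check: (1 - it/lam)^(-alpha) equals
# (1 + t^2/lam^2)^(-alpha/2) * exp(i * alpha * arctan(t / lam)).
import cmath
import math

def cf_direct(t, alpha, lam):
    return (1 - 1j * t / lam) ** (-alpha)

def cf_polar(t, alpha, lam):
    modulus = (1 + t ** 2 / lam ** 2) ** (-alpha / 2)
    return modulus * cmath.exp(1j * alpha * math.atan(t / lam))

for t, alpha, lam in [(1.0, 0.5, 2.0), (-3.0, 2.5, 0.7), (0.0, 1.0, 1.0)]:
    assert abs(cf_direct(t, alpha, lam) - cf_polar(t, alpha, lam)) < 1e-12
print("polar form agrees with the direct computation")
```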
From Lévy's continuity theorem we have $$ \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n Ee^{itX_n} = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr).$$
If $\alpha_n \ge c > 0$ and $\lambda_n \to 0$ then $|Ee^{itX_n}| = {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2} \to I_{\{t=0\}}$, where $I$ is an indicator function. If $X_n \overset{w}{\to} X$ then $\lim_n |Ee^{itX_n}| = |Ee^{itX}|$ is continuous (as the modulus of a characteristic function). But $I_{\{t=0\}}$ is not continuous. Hence $X_n$ diverges. The first statement of the theorem is proved.
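The first statement can also be seen through tightness; here is a numerical sketch with the hypothetical choice $\alpha_n = 1$, $\lambda_n = 1/n$ (again using SciPy's shape–scale parameterization):

```python
# With alpha_n >= c and lambda_n -> 0 the mass escapes to infinity:
# P(X_n <= M) -> 0 for every fixed M, so (X_n) is not tight and no
# subsequence can converge in distribution.
from scipy.stats import gamma

M = 100.0
for n in [10, 100, 1000, 10000]:
    # alpha_n = 1, lambda_n = 1/n, i.e. scale = n; here P(X_n <= M) = 1 - exp(-M/n)
    p = gamma(a=1.0, scale=float(n)).cdf(M)
    print(n, p)
```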
Here and everywhere below we will assume that $\alpha_n \to 0$. Thus $\alpha_n \arctan \frac{t}{\lambda_n} = \text{ bounded } \cdot \alpha_n = o(1)$, $n \to \infty$. Therefore $e^{i \alpha_n \arctan \frac{t}{\lambda_n} } = 1 + o(1).$
Put $\phi_n(t) ={\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)}^{-\frac{\alpha_n}2}$. We proved that \begin{gather} \Bigl( X_n \overset{w}{\to} \Bigr) \Longleftrightarrow \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n \phi_n(t) = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr). \tag{1} \end{gather}
If $\alpha_n \to 0$ and $\lambda_n \ge c > 0$ then $\phi_n(t) \to 1$ and moreover $Ee^{itX_n} \to 1$, hence $X_n \overset{w}{\to} 0$. The second statement of the theorem is proved.
Here and everywhere below we additionally assume that $\lambda_n \to 0$. For $t \ne 0$ we have
\begin{gather} \phi_n(t) = e^{-\frac{\alpha_n}2 \ln {\Bigl( 1 + \frac{t^2}{\lambda_n^2}\Bigr)} } = e^{-\frac{\alpha_n}2 (1 + o(1))\ln {\Bigl(\frac{t^2}{\lambda_n^2}\Bigr)} } = \nonumber \\ = e^{-\alpha_n (1 + o(1)) ( \ln |t| - \ln \lambda_n )} = e^{\alpha_n \ln \lambda_n (1 + o(1))} \nonumber \end{gather} (in the last step we used $\ln |t| - \ln \lambda_n = -\ln \lambda_n \, (1 + o(1))$, because $\ln \lambda_n \to -\infty$).
Hence for $t \ne 0$ we have $$ \exists \lim_n \phi_n(t) \Longleftrightarrow \exists \lim_n \alpha_n \ln \lambda_n = A \in [-\infty, 0],$$ and if this condition holds, then $\phi(t) = \lim_n \phi_n(t) = e^{A}$ for $t \ne 0$. Obviously $\lim_n \phi_n(0) = \lim_n 1 = 1$.
So \begin{gather} \Bigl( \forall t \in \mathbb{R} \ \ \exists \lim_n \phi_n(t) = \phi(t) \text{ and } \phi(t) \text{ is continuous at } 0 \Bigr) \Longleftrightarrow \nonumber \\ \exists \lim_n (\alpha_n \ln \lambda_n) = A \text{ and } A=0. \tag{2} \end{gather} From (1) and (2) we get that $X_n$ converges weakly iff $ \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0$. Moreover, if $ \lim_{n \to \infty} \alpha_n \ln \lambda_n = 0$ then $Ee^{itX_n} = \phi_n(t) \cdot e^{i \alpha_n \arctan \frac{t}{\lambda_n} } \to 1 = Ee^{it\cdot 0}$, hence $X_n \overset{w}{\to} 0$.
The theorem is proved.