If the double-sided exponential (Laplace) distribution has density $$ g(x) = \frac{\lambda}{2}e^{-\lambda|x|} $$ and we have a sequence of independent r.v.s $X_n$, where $X_n$ has the double-sided exponential distribution with parameter $\lambda_n = \frac{1}{\sqrt{n}}$, prove that there exist sequences $a_n, b_n$ such that $$ \frac{\left(\sum\limits_{1\leq k \leq n}X_k\right)-a_n}{b_n} \text{ converges weakly to } N(0,1). $$
I know this is a Lindeberg CLT type of problem, but I was unable to proceed further than computing the moments: by symmetry $\mathbb{E}X = 0$, and $\mathbb{E}X^2 = \frac{2}{\lambda^2}$. Taking $X_n$ with $\lambda_n=\frac{1}{\sqrt{n}}$ gives $\mathbb{E}X_n = 0$, so $\sum_{k \leq n} \mathbb{E} X_k = 0$, and $\mathbb{E}X_n^2 = 2n$, so $\sum_{k \leq n} \mathbb{E} X_k^2 = n(n+1) \sim n^2$.
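As a sanity check on the normalization (my own guess, not part of the exercise): since $X_k$ is symmetric, $\mathbb{E}X_k = 0$ and $\operatorname{Var}X_k = 2/\lambda_k^2 = 2k$, which suggests $a_n = 0$ and $b_n = \sqrt{\sum_{k\le n} 2k} = \sqrt{n(n+1)}$. A quick Monte Carlo simulation of the standardized sums:

```python
import numpy as np

# X_k ~ Laplace with rate lambda_k = 1/sqrt(k); NumPy parameterizes
# the Laplace density as (1/(2b)) * exp(-|x|/b), so scale b_k = sqrt(k).
rng = np.random.default_rng(0)
n, reps = 500, 20_000

scales = np.sqrt(np.arange(1, n + 1))       # b_k = 1/lambda_k = sqrt(k)
samples = rng.laplace(scale=scales, size=(reps, n))

b_n = np.sqrt(n * (n + 1))                  # sqrt of the total variance
z = samples.sum(axis=1) / b_n               # standardized sums (a_n = 0)

# If the guess is right, z should look like N(0, 1).
print(z.mean(), z.std())
```

The sample mean and standard deviation come out close to $0$ and $1$, which is consistent with the guessed $a_n, b_n$, though of course it proves nothing.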
Could anyone solve this exercise? I tried looking for a similar example, but there aren't many examples of this kind of CLT on math.stackexchange.