I am struggling with the following problem.
Let $X$ be a real-valued random variable whose distribution is absolutely continuous with respect to the Lebesgue measure. Denote by $F$ and $f$ the cumulative distribution function and the density of $X$, respectively, and suppose $F$ is strictly increasing. Let $(X_1,\dots,X_n)$ be an i.i.d. sample with $X_i \sim X$ for all $i$.
Let: \begin{equation*} F_n(x) = n^{-1}\sum_{i=1}^n \boldsymbol{1}_{\{X_i \leq x\}} \end{equation*} be the associated empirical c.d.f. for $x\in \mathbb{R}$, and \begin{equation*} F_n^{-1}(\alpha) = \inf\{u \in \mathbb{R}|F_n(u)\geq \alpha\} \end{equation*} be the generalized inverse of the empirical c.d.f./empirical quantile function for $\alpha \in [0,1]$.
It is well known that, for fixed $\alpha \in (0,1)$, $\sqrt{n}(F_n^{-1}(\alpha)-F^{-1}(\alpha))$ converges in distribution to a centered Gaussian with variance $\sigma^2 = \frac{\alpha (1-\alpha)}{\left(f\circ F^{-1}(\alpha)\right)^2}$, provided $f$ is positive and continuous at $F^{-1}(\alpha)$.
I am interested in the asymptotic distribution of: \begin{equation*} T_n = \sqrt{n}(F_n^{-1}(\alpha_n)-F^{-1}(\alpha)) \end{equation*} where $\alpha_n = \alpha + Z_n$ with $Z_n \sim \mathcal{N}\left(0,\frac{\alpha (1-\alpha)}{n}\right)$ drawn independently of the sample. Numerical simulations suggest that $T_n$ converges in distribution to $\mathcal{N}(0,2\sigma^2)$. Adding and subtracting $F^{-1}(\alpha_n)$ gives $T_n = A_n + B_n$ with \begin{equation*} A_n = \sqrt{n}(F_n^{-1}(\alpha_n)-F^{-1}(\alpha_n)), \qquad B_n = \sqrt{n}(F^{-1}(\alpha_n)-F^{-1}(\alpha)), \end{equation*} where both $A_n$ and $B_n$ converge in distribution to $\mathcal{N}(0,\sigma^2)$; but since they are not independent, I cannot conclude that their sum converges to $\mathcal{N}(0,2\sigma^2)$.
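For reference, a minimal sketch of the kind of simulation I mean (assuming $X$ standard normal and the perturbation $Z_n$ drawn independently of the sample; NumPy's `inverted_cdf` quantile method coincides with the generalized inverse $F_n^{-1}$ defined above):

```python
import numpy as np
from statistics import NormalDist

# Sketch only: X is taken standard normal, Z_n independent of the sample.
rng = np.random.default_rng(0)
alpha, n, reps = 0.3, 10_000, 5_000

q = NormalDist().inv_cdf(alpha)                        # true quantile F^{-1}(alpha)
sigma2 = alpha * (1 - alpha) / NormalDist().pdf(q)**2  # asymptotic variance

t = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal(n)
    z = rng.normal(0.0, np.sqrt(alpha * (1 - alpha) / n))
    a_n = min(max(alpha + z, 1e-9), 1 - 1e-9)          # keep alpha_n inside (0, 1)
    # empirical quantile F_n^{-1}(alpha_n) via the inverted c.d.f.
    t[r] = np.sqrt(n) * (np.quantile(x, a_n, method="inverted_cdf") - q)

print(np.var(t) / sigma2)  # ratio close to 2, in line with the N(0, 2*sigma^2) guess
```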
Any help would be greatly appreciated! Thanks