I'm interested in the convergence of a stochastic process. Let $(X_i)_{i \geq 1}$ be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$, and fix $0 < t < T$ for some $T$. Assuming for simplicity that $nt$ and $nT$ are integers, consider the process $$\left(\frac{1}{\sqrt{n}}\sum\limits_{j=1}^{nt}{\left(\frac{\sum\limits_{i=1}^{nT}{X_i}}{nT}\right)^2}\right)_t=\left( \sqrt{n}\,t \left(\frac{\sum\limits_{i=1}^{nT}{X_i}}{nT}\right)^2\right)_{t} = \left(\frac{t}{T}\,\frac{1}{\sqrt{n}}\sum\limits_{i=1}^{nT}{X_i}\, \frac{\sum\limits_{i=1}^{nT}{X_i}}{nT}\right)_t,$$ where the first equality holds because the summand does not depend on the summation index $j$. How does this process, viewed as a function of $t$, behave as $n \to \infty$? Does it converge to a Brownian motion with variance depending on $\frac{t}{T}$ and a drift towards $\sqrt{n}\,t\,\mu^2$?
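For concreteness, here is a minimal simulation sketch of the closed form $\sqrt{n}\,t\,(S_{nT}/(nT))^2$ above (assuming standard normal $X_i$, so $\mu = 0$ and $\sigma^2 = 1$; the function name and grid are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def process(n, T, t_grid, rng):
    """Evaluate the rescaled process sqrt(n) * t * (S_{nT} / (nT))^2
    on a grid of t values, for standard normal X_i (an assumption)."""
    # Sum of the first n*T i.i.d. draws
    S = rng.standard_normal(int(n * T)).sum()
    # The inner summand does not depend on the summation index,
    # so the process collapses to this closed form, linear in t.
    return np.sqrt(n) * t_grid * (S / (n * T)) ** 2

T = 1.0
t_grid = np.linspace(0.0, T, 11)
for n in (10**2, 10**4, 10**6):
    vals = process(n, T, t_grid, rng)
    print(n, vals[-1])
```

For each fixed $n$ the sampled path is exactly linear in $t$ (the randomness enters only through the single sum $S_{nT}$), which is what made me doubt my own Brownian-motion guess; in the $\mu = 0$ case the endpoint value also appears to shrink as $n$ grows.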
Many thanks!