Let $X_1,\dots,X_n$ be i.i.d. with distribution function $F$, and let $\hat F_n$ be the empirical distribution function, i.e., $$ \hat F_n(x)=\frac1n\sum_{i=1}^n1_{\{X_i\le x\}}, $$ where $1_A$ is the indicator function.
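For concreteness, $\hat F_n$ can be evaluated as follows (a minimal Python sketch, numpy assumed; the helper name `ecdf` is mine):

```python
import numpy as np

def ecdf(sample, x):
    """Empirical distribution function of `sample`, evaluated at the points `x`."""
    sample = np.sort(np.asarray(sample))
    # F_n(x) = (number of observations <= x) / n
    return np.searchsorted(sample, x, side="right") / len(sample)

# F_n = 0, 1/3, 2/3 at these points for the sample {1, 2, 3}
print(ecdf([1.0, 2.0, 3.0], np.array([0.5, 1.0, 2.5])))
```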
I would like to know the order of $$ T_n=\sum_{i=1}^n\left(\{\Phi^{-1}(\hat F_n(X_i))\}^2-\{\Phi^{-1}(F(X_i))\}^2\right), $$ where $\Phi$ is the standard normal cumulative distribution function.
I have a feeling that $T_n=O_p(\sqrt n)$, since for every fixed $x\in\mathbb{R}$ one has $\{\Phi^{-1}(\hat F_n(x))\}^2-\{\Phi^{-1}(F(x))\}^2=O_p(1/\sqrt n)$. However, I cannot extend this pointwise bound to the order of $T_n$, since $\hat F_n$ itself depends on the $X_i$. Could anyone help me?
A problem with the question is that $\Phi^{-1}(\hat F_n(X_i))=+\infty$ for one index $i$, since $\hat F_n(X_i)=1$ when $X_i$ is the maximum of the sample. Thus, we assume that the sum defining $T_n$ omits the maximum of the sample. We also assume, for simplicity, that $F$ is continuous. Then there are no ties and, by a standard result, $(F(X_i))$ is distributed like an i.i.d. sample $(U_i)$ uniform on $(0,1)$. Let $(U_{(i)})$ denote the ordered sample and $K=(\Phi^{-1})^2$; since $\hat F_n(X_{(i)})=i/n$, $$ T_n=\sum_{i=1}^{n-1}\left(K(i/n)-K(U_{(i)})\right). $$

Let us recall some known facts about $(U_{(i)})$. First, for every $i\leqslant j$, $$ E(U_{(i)})=\frac{i}{n+1},\qquad\mathrm{Cov}(U_{(i)},U_{(j)})=\frac{i(n+1-j)}{(n+1)^2(n+2)}. $$ Second, when $n\to\infty$, the family $(U_{(i)})$ is approximately a Gaussian vector, hence, by the delta-method, $T_n$ is approximately normal and centered, with variance $$ s^2_n=\mathrm{Var}\left(\sum_{i=1}^{n-1}K'(i/n)\left(U_{(i)}-E(U_{(i)})\right)\right). $$

Note that $$ s^2_n=\sum_{i,j}K'(i/n)K'(j/n)\mathrm{Cov}(U_{(i)},U_{(j)}), $$ and that, when $n\to\infty$ with $i/n\to x$ and $j/n\to y$, $$ n\,\mathrm{Cov}(U_{(i)},U_{(j)})\to R(x,y), $$ where $$ R(x,y)=\min\{x,y\}\cdot(1-\max\{x,y\}). $$ This indicates that $s_n^2/n$ corresponds to Riemann sums of a function on $[0,1]^2$, hence $s_n^2/n\to\sigma^2$, with $$ \sigma^2=\iint_{[0,1]^2}K'(x)K'(y)R(x,y)\,\mathrm dx\,\mathrm dy. $$ Since $\sigma^2\ne0$, this shows that $T_n/\sqrt{n}$ converges in distribution to a centered normal distribution with variance $\sigma^2$.
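One can check the Riemann-sum claim numerically by evaluating $s_n^2/n$ with the exact covariance formula above (a Python sketch, numpy/scipy assumed; the cut-off at $i=n-1$ matches the sum defining $T_n$, and the choice $n=2000$ is arbitrary):

```python
import numpy as np
from scipy.stats import norm

def s2_over_n(n):
    """Exact finite-n value of s_n^2 / n from the order-statistic covariances."""
    i = np.arange(1, n)                 # i = 1, ..., n-1
    q = norm.ppf(i / n)                 # Phi^{-1}(i/n)
    kp = 2.0 * q / norm.pdf(q)          # K'(x) = 2 Phi^{-1}(x) / phi(Phi^{-1}(x))
    lo = np.minimum.outer(i, i)         # min(i, j)
    hi = np.maximum.outer(i, i)         # max(i, j)
    # Cov(U_(i), U_(j)) = min(i,j) (n+1-max(i,j)) / ((n+1)^2 (n+2))
    cov = lo * (n + 1 - hi) / ((n + 1) ** 2 * (n + 2))
    return kp @ cov @ kp / n

print(s2_over_n(2000))  # approaches sigma^2 as n grows
```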
To identify $\sigma^2$, note that $K(\Phi(u))=u^2$, hence $K'(\Phi(u))\varphi(u)=2u$, and, by the change of variables $x=\Phi(u)$, $y=\Phi(v)$, $$ \sigma^2=\iint_{\mathbb R^2}K'(\Phi(u))K'(\Phi(v))R(\Phi(u),\Phi(v))\varphi(u)\varphi(v)\,\mathrm du\,\mathrm dv, $$ thus $$ \sigma^2=\iint_{\mathbb R^2}4uv\,R(\Phi(u),\Phi(v))\,\mathrm du\,\mathrm dv. $$

Using Fubini's theorem and the identity $$ R(\Phi(u),\Phi(v))=\int_{-\infty}^{\min\{u,v\}}\varphi(s)\left(\int_{\max\{u,v\}}^{\infty}\varphi(t)\,\mathrm dt\right)\mathrm ds, $$ one sees that $$ \sigma^2=\iint I(s,t)\varphi(s)\varphi(t)\,\mathrm ds\,\mathrm dt, $$ where $$ I(s,t)=\mathbf 1_{s\lt t}\iint 4uv\,\mathbf 1_{s\lt\min\{u,v\},\ \max\{u,v\}\lt t}\,\mathrm dv\,\mathrm du, $$ that is, $$ I(s,t)=\mathbf 1_{s\lt t}\cdot\int_s^t2u\,\mathrm du\cdot\int_s^t2v\,\mathrm dv=\mathbf 1_{s\lt t}\cdot(t^2-s^2)^2. $$

Since $(t^2-s^2)^2$ is symmetric in $s$ and $t$, dropping the indicator $\mathbf 1_{s\lt t}$ doubles the integral, hence $$ 2\sigma^2=\iint (t^2-s^2)^2\varphi(s)\varphi(t)\,\mathrm ds\,\mathrm dt=E\left((\xi^2-\eta^2)^2\right), $$ where $\xi$ and $\eta$ are independent standard normal variables. One knows that $E(\xi^2)=E(\eta^2)=1$ and $E(\xi^4)=E(\eta^4)=3$, hence $$ 2\sigma^2=3-2\cdot1\cdot1+3=4, $$ that is, $\sigma^2=2$, and finally $$ T_n/\sqrt{n}\stackrel{\mathrm{dist.}}{\longrightarrow}\mathcal N(0,2). $$
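As a sanity check on this limit, here is a small Monte Carlo sketch (Python, numpy/scipy assumed; sample size and repetition count are arbitrary choices). Sampling from $F=\Phi$ makes $\Phi^{-1}(F(X_i))=X_i$, and on the order statistics $\hat F_n(X_{(i)})=i/n$, with the maximum dropped as above:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def t_n(n):
    # Sample from F = Phi, so Phi^{-1}(F(X_i)) = X_i; sort and drop the maximum.
    x = np.sort(rng.standard_normal(n))[:-1]
    # On the order statistics, F_n(X_(i)) = i/n for i = 1, ..., n-1.
    grid = np.arange(1, n) / n
    return np.sum(norm.ppf(grid) ** 2 - x ** 2)

n, reps = 2000, 500
sims = np.array([t_n(n) for _ in range(reps)]) / np.sqrt(n)
print(sims.mean(), sims.var())  # mean near 0, variance near sigma^2 = 2
```

The finite-$n$ bias of the extreme terms vanishes only slowly, so the sample variance is close to, but not exactly, $2$ at this sample size.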