Let $f(t,X) = \log(1+t^2 X)/t$ and let $G_n(t) = n^{-1} \sum_{i=1}^n f(t,X_i)$.
Define $T_n$ as the point that maximizes $G_n(t)$, and let $T_\infty$ be the point that maximizes the non-random function $g(t) = E G_n(t)$.
My goal is to find the asymptotic distribution of $\sqrt{n}(T_n - T_\infty)$.
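For concreteness, $T_n$ can be computed numerically for a simulated sample. A minimal sketch, assuming for illustration that $X \sim \mathrm{Exp}(1)$ (a choice not fixed by the problem); the helper `argmax_G` is ad hoc, combining a coarse grid search with golden-section refinement:

```python
import math
import random

def f(t, x):
    # objective from the question: f(t, X) = log(1 + t^2 X) / t
    return math.log(1.0 + t * t * x) / t

def G_n(t, xs):
    # empirical criterion G_n(t) = n^{-1} * sum_i f(t, X_i)
    return sum(f(t, x) for x in xs) / len(xs)

def argmax_G(xs):
    # coarse grid search on (0, 10], then golden-section refinement
    grid = [0.1 * i for i in range(1, 101)]
    t0 = max(grid, key=lambda t: G_n(t, xs))
    a, b = max(0.05, t0 - 0.2), t0 + 0.2
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    for _ in range(50):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if G_n(c, xs) < G_n(d, xs):
            a = c  # maximum lies in [c, b]
        else:
            b = d  # maximum lies in [a, d]
    return (a + b) / 2.0

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(2000)]  # illustrative choice: X ~ Exp(1)
T_n = argmax_G(xs)
print(T_n)
```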
I have successfully shown:

1. $T_n \to T_\infty$ in probability;
2. $G_n(T_n) \to g(T_\infty)$ in probability;
3. $\sqrt{n}(G_n(T_n) - G_n(T_\infty)) \to N(0,\sigma^2(T_\infty))$;
4. an expression for $\sigma^2(t)$.
However, the last step, getting from the asymptotic distribution in (3) to the desired result, is hindered by the fact that $f(t,X)$ and $G_n(t)$ are not invertible, so the $\delta$-method cannot be applied.
Does anyone see a way forward?
Edit:
Looking at this further, I think the trick is to use a Taylor expansion, so I'm going down that route for now!
I suggest mimicking the derivation for the Maximum Likelihood Estimator. The first-order condition is
$$ G_n'(T_n) = \frac{1}{n}\sum_{i\leq n} f_1(T_n,X_i) = 0, \qquad (*) $$
where $f_1$ denotes the partial derivative with respect to $t$.

Roughly speaking, under sufficient regularity of $f$, it holds that
$$ S_n := \frac{1}{n}\sum_{i\leq n} f_1(T_\infty,X_i) \to E[f_1(T_\infty,X_1)] = g'(T_\infty) = 0 $$
almost surely. Moreover, the CLT gives the limiting distribution of $S_n$:
$$ \sqrt{n}\,S_n \to N(0,\sigma^2), \qquad \sigma^2 = \operatorname{var}(f_1(T_\infty,X_1)) = E[f_1(T_\infty,X_1)^2]. $$

From $(*)$, we know
$$ \sqrt{n}\,S_n = \sqrt{n}\,(S_n - G_n'(T_n)) = \frac{1}{\sqrt{n}}\sum_{i\leq n}\big(f_1(T_\infty,X_i) - f_1(T_n,X_i)\big) = \frac{1}{n}\sum_{i\leq n} f_{11}\big(sT_\infty + (1-s)T_n,\, X_i\big)\cdot \sqrt{n}\,(T_\infty - T_n), $$
where the last equality uses the mean value theorem ($0<s<1$; strictly, $s$ depends on $i$, but the argument goes through with a uniform law of large numbers). Under sufficient regularity conditions,
$$ \frac{1}{n}\sum_{i\leq n} f_{11}\big(sT_\infty + (1-s)T_n,\, X_i\big) \to E[f_{11}(T_\infty,X_1)] =: \tau \neq 0 $$
(being nonzero is assumed). Finally, Slutsky's theorem gives the result:
$$ \sqrt{n}\,(T_n - T_\infty) \to N\!\left(0, \frac{\sigma^2}{\tau^2}\right). $$
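As a numerical sanity check of this recipe for the specific $f$ in the question: the sketch below adopts the illustrative assumption $X \sim \mathrm{Exp}(1)$ (not part of the question), locates $T_\infty$ as the root of $g'$ by bisection, and evaluates $\sigma^2$ and $\tau$ by quadrature. The derivatives $f_1$, $f_{11}$ are worked out by hand, and the helper names `E` and `gprime` are ad hoc:

```python
import math

# Illustrative assumption (not in the question): X ~ Exp(1), so that
# E[h(X)] equals the integral of h(-log u) over u in (0, 1);
# approximate it with the midpoint rule.
def E(h, m=4000):
    return sum(h(-math.log((i + 0.5) / m)) for i in range(m)) / m

def f1(t, x):
    # d/dt of f(t, x) = log(1 + t^2 x)/t, computed by hand
    return 2.0 * x / (1.0 + t * t * x) - math.log(1.0 + t * t * x) / t ** 2

def f11(t, x):
    # second t-derivative, computed by hand
    u = 1.0 + t * t * x
    return -4.0 * t * x * x / u ** 2 - 2.0 * x / (t * u) + 2.0 * math.log(u) / t ** 3

def gprime(t):
    # g'(t) = E[f1(t, X)]
    return E(lambda x: f1(t, x))

# T_infty solves g'(t) = 0; bisection on a bracket where g' changes sign
a, b = 0.5, 10.0
for _ in range(40):
    mid = (a + b) / 2.0
    if gprime(mid) > 0.0:
        a = mid
    else:
        b = mid
T_inf = (a + b) / 2.0

sigma2 = E(lambda x: f1(T_inf, x) ** 2)  # var(f1(T_inf, X)) since its mean is 0
tau = E(lambda x: f11(T_inf, x))         # assumed nonzero; negative at a maximum
avar = sigma2 / tau ** 2                 # asymptotic variance of sqrt(n)(T_n - T_inf)
print(T_inf, sigma2, tau, avar)
```

The plug-in value $\sigma^2/\tau^2$ printed at the end is the asymptotic variance predicted by the Slutsky step above; in practice one would replace the population expectations with sample averages evaluated at $T_n$.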