Let $(X_n)_{n\geq 1}$ be i.i.d. standard Cauchy variables. For each $n$, let $(Y_{1,n},\dots,Y_{n,n})$ be the variables $(X_1,\dots,X_n)$ reordered by decreasing absolute value.
The sum $\sum_{i=1}^n Y_{i,n}/n$ has exactly the standard Cauchy distribution for every $n$ (reordering does not change the sum, and the average of $n$ i.i.d. standard Cauchy variables is again standard Cauchy), and it looks mainly determined by its first terms. My question is: how far into the sum do I have to go to approximate it properly? More rigorously, I'm looking for $f$ such that $\sum_{i=f(n)}^n Y_{i,n}/n \to 0$ in probability, with at least $f(n)/n\to 0$ (if such an $f$ exists). Ideally, I'd like $f$ to be sharp, in the sense that if $g=o(f)$ then the convergence fails for $g$.
My bet is that any $f(n)=n^\alpha$ with $1/2<\alpha<1$ should do it, but I'm unable to prove it for even one exponent...
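Not a proof, of course, but a quick simulation supports the bet (a minimal NumPy sketch; the choice $\alpha=0.6$, the sample sizes, and the $200$ repetitions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_sum(n, alpha, rng):
    """(1/n) * sum of the reordered sample beyond rank n^alpha,
    where rank 1 is the term largest in absolute value."""
    x = rng.standard_cauchy(n)
    y = x[np.argsort(-np.abs(x))]   # reorder by decreasing |x|
    k = int(np.ceil(n ** alpha))
    return y[k:].sum() / n

alpha = 0.6
for n in (10**3, 10**4, 10**5):
    s = np.array([tail_sum(n, alpha, rng) for _ in range(200)])
    print(f"n = {n:>6}: empirical std of the tail sum = {s.std():.4f}")
```

The empirical spread of the tail sum visibly shrinks as $n$ grows.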
My heuristic is as follows: for $|Y_{\lceil n^\alpha\rceil,n}|$ to exceed $t$, at least $n^\alpha$ of the $|X_i|$ must exceed $t$, and since $\mathbb{P}(|X_1|>t)\sim\frac{2}{\pi t}$ this forces $t$ to be of order at most $n^{1-\alpha}$. From this, we deduce that $\mathbb{E}[Y_{\lceil n^\alpha\rceil,n}^2]$ is asymptotically bounded by $C n^{2\beta}$ for any $\beta$ with $1-\alpha<\beta<1$.
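To make this quantitative (a sketch, using only the union bound, $\binom{n}{k}\le(en/k)^k$, and the Cauchy tail $\mathbb{P}(|X_1|>t)\le\frac{2}{\pi t}$):

$$\mathbb{P}\big(|Y_{k,n}|>t\big)\;\le\;\binom{n}{k}\,\mathbb{P}(|X_1|>t)^k\;\le\;\Big(\frac{2en}{\pi k t}\Big)^k,$$

and integrating $\mathbb{E}[Y_{k,n}^2]=\int_0^\infty 2t\,\mathbb{P}(|Y_{k,n}|>t)\,dt$ against this tail gives $\mathbb{E}[Y_{k,n}^2]\le C\,(n/k)^2$ for $k\ge 3$; taking $k=\lceil n^\alpha\rceil$ recovers the bound above.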
Using the fact that, conditionally on the absolute values, the signs of the $Y_{i,n}$, $i\geq f(n)$, are i.i.d. uniform on $\{\pm 1\}$, plus the bound above, we see that $\sum_{i=f(n)}^n Y_{i,n}/n$ is approximately bounded by a Gaussian whose variance goes to $0$ iff $\beta<1/2$. It's only a heuristic, since we cannot really apply the CLT here (partly because the variables are not independent, and also because the laws vary with $n$...).
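Spelling out the variance computation behind the "iff $\beta<1/2$" (conditionally on the absolute values, the tail sum is a sum of independent symmetric signs times fixed magnitudes):

$$\operatorname{Var}\!\Big(\sum_{i=f(n)}^n \frac{Y_{i,n}}{n}\;\Big|\;|Y_{1,n}|,\dots,|Y_{n,n}|\Big)\;=\;\frac{1}{n^2}\sum_{i=f(n)}^n Y_{i,n}^2,$$

whose expectation is at most $\frac{1}{n^2}\cdot n\cdot Cn^{2\beta}=Cn^{2\beta-1}$ if one uses only the crude bound $\mathbb{E}[Y_{i,n}^2]\le Cn^{2\beta}$ for every $i\geq f(n)$. This tends to $0$ iff $\beta<1/2$, and combined with $\beta>1-\alpha$ this is where the restriction $\alpha>1/2$ comes from.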
Thanks a lot, have a good day!
Edit for anyone interested: If I'm not mistaken, it's in fact true for every $0<\alpha<1$. After the estimate, we only need to apply Markov's inequality to $\big(\sum_{i=\lceil n^\alpha\rceil}^n Y_{i,n}/n\big)^2$ (using the known fact that $Y_{i,n}$ has a finite second moment for $i\geq 5$), and to remark that, thanks to sign independence, $\mathbb{E}[Y_{i,n}Y_{j,n}]=0$ as soon as $i\neq j$, so the cross terms vanish.
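In detail (assuming the estimate $\mathbb{E}[Y_{i,n}^2]\le C(n/i)^2$ obtained above):

$$\mathbb{E}\bigg[\Big(\sum_{i=\lceil n^\alpha\rceil}^n \frac{Y_{i,n}}{n}\Big)^{\!2}\bigg]\;=\;\frac{1}{n^2}\sum_{i=\lceil n^\alpha\rceil}^n \mathbb{E}\big[Y_{i,n}^2\big]\;\le\;C\sum_{i\ge \lceil n^\alpha\rceil}\frac{1}{i^2}\;\le\;\frac{C'}{n^{\alpha}}\;\longrightarrow\;0,$$

so the tail sum converges to $0$ in $L^2$, hence in probability, for every $0<\alpha<1$: the sharper per-term bound $C(n/i)^2$ (instead of the crude $Cn^{2\beta}$ applied to every term) is what removes the $\alpha>1/2$ restriction.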