I'm trying to prove the following statement about the convergence of the sample mean absolute deviation.
Let $X_1, \ldots, X_n$ be the order statistics of a sample from $N(\mu, \sigma^2)$.
Define $\hat{\xi} \equiv \frac{1}{n} \sum_{i=1}^{n} |X_i - \hat{\mu}|$, where $\hat{\mu} = X_{n/2}$ is the sample median.
I would like to show that $\hat{\xi}$ can be rewritten as
$$\hat{\xi} = \frac{1}{n} \sum_{i=1}^{n} |X_i - \mu| + R(\mu, \hat{\mu}),$$
where $R(\mu, \hat{\mu}) \to 0$ as $n \to \infty$.
What I tried:
Let $r_n$ be such that $X_{r_n} < \hat{\mu} < X_{r_n + 1}$, and $s_n$ be such that $X_{s_n} < \mu < X_{s_n + 1}$.
I'm trying to simplify the formula using these indices, but I keep failing.
Could anyone help me out?
I think this is actually much simpler than it seems.
First, note that for any real numbers $a,b,c$, we have $$\Big| |a - b| - |a - c| \Big| \leq |b - c|\,.$$
To prove this, just check the sign cases directly.
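Alternatively, the reverse triangle inequality gives a one-line, case-free proof:
$$\Big| |a - b| - |a - c| \Big| \leq \big| (a - b) - (a - c) \big| = |b - c|\,.$$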
Applying this with $a = X_i$, $b = \hat{\mu}$, $c = \mu$ and averaging over $i$ gives
$$|R(\mu,\hat{\mu})| \leq \frac{1}{n} \sum_{i=1}^{n} \Big| |X_i - \hat{\mu}| - |X_i - \mu| \Big| \leq |\mu - \hat{\mu}|\,.$$
The result then follows from the consistency of the sample median, i.e. $\hat{\mu} \to \mu$ in probability as $n \to \infty$.
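As a sanity check, here is a small simulation (my own sketch, not from the question; I draw the sample directly rather than working with order statistics, and the variable names are mine). It illustrates that the remainder $R(\mu,\hat{\mu}) = \hat{\xi} - \frac{1}{n}\sum_i |X_i - \mu|$ shrinks as $n$ grows, and that the bound $|R| \leq |\hat{\mu} - \mu|$ holds for every sample:

```python
import random
import statistics

random.seed(0)
mu, sigma = 5.0, 2.0  # true parameters of the normal distribution

for n in (100, 1000, 10000):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    mu_hat = statistics.median(x)                    # sample median
    xi_hat = sum(abs(v - mu_hat) for v in x) / n     # MAD about the sample median
    xi_ref = sum(abs(v - mu) for v in x) / n         # MAD about the true center
    R = xi_hat - xi_ref                              # remainder R(mu, mu_hat)
    # |R| <= |mu_hat - mu| holds deterministically, and both shrink with n.
    print(n, abs(R), abs(mu_hat - mu))
```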