Creating an epsilon of room to prove convergence in mean


Let $X$ be a random variable taking values in $\mathbb{R}^d$. Denote by $X_{(i)}(x)$ the $i$th closest point to $x$ (in $\ell_2$ norm) among an iid sample of size $n$ drawn from the distribution of $X$.

Show that for any integrable function $f: \mathbb{R}^d \rightarrow \mathbb{R}$:

$$\frac{1}{k}\sum^k_{i=1}\mathbb{E}\left \{ \left | f(X) - f(X_{(i)}(X)) \right |\right \} \rightarrow 0$$

as $n \rightarrow \infty$ whenever $k/n \rightarrow 0$.
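As a sanity check on the claim (purely illustrative, not part of a proof), here is a small Monte Carlo sketch. The distribution of $X$, the dimension $d = 2$, and the test function $f$ below are hypothetical choices; $f$ is bounded and discontinuous across $\{x_1 = 0\}$, so it is integrable but not continuous.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup for illustration: X ~ N(0, I_2) in R^2 and a
# bounded, discontinuous (hence integrable but not continuous) f.
def f(x):
    return np.sign(x[..., 0]) * np.exp(-np.linalg.norm(x, axis=-1))

def mean_knn_gap(n, k, trials=200):
    """Monte Carlo estimate of (1/k) * sum_{i=1}^k E|f(X) - f(X_(i)(X))|."""
    total = 0.0
    for _ in range(trials):
        sample = rng.standard_normal((n, 2))    # iid sample of size n
        x = rng.standard_normal(2)              # independent draw of X
        d = np.linalg.norm(sample - x, axis=1)  # l2 distances to X
        nearest = sample[np.argsort(d)[:k]]     # k nearest sample points
        total += np.mean(np.abs(f(x) - f(nearest)))
    return total / trials

for n in (100, 1000, 10000):
    print(n, mean_knn_gap(n, k=5))  # gap should shrink as n grows, k fixed
```

With $k = 5$ fixed (so $k/n \rightarrow 0$), the printed average gap should shrink as $n$ grows, consistent with the claimed convergence.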

What I've tried

I'd like to apply the "create an epsilon of room" trick from the Tricki: http://www.tricki.org/article/Create_an_epsilon_of_room

I'll introduce a uniformly continuous function $f_\epsilon$ with bounded support that approximates $f$ in the sense that:

$$\mathbb{E}\{|f_\epsilon(X) - f(X)|\} \leq \epsilon$$

This can be done since the uniformly continuous functions with bounded support form a dense subset of the space of integrable functions. We will first show the claim for $f_\epsilon$. By uniform continuity, for each $\eta > 0$ there exists $\nu > 0$ such that for all $\delta \in \mathbb{R}^d$ with $\|\delta\|_2 \leq \nu$:

$$|f_\epsilon(x) - f_\epsilon(x + \delta)| \leq \eta \quad \text{for all } x \in \mathbb{R}^d$$

So each term in the summation goes to $0$: splitting on the event $\{\|X - X_{(i)}(X)\|_2 \leq \nu\}$ and its complement gives $$\mathbb{E}\{|f_\epsilon(X) - f_\epsilon(X_{(i)}(X))|\} \leq \eta + 2 \|f_\epsilon\|_\infty \, \mathbb{P}(\|X - X_{(i)}(X)\|_2 > \nu)$$

Both terms on the RHS can be made arbitrarily small: the first by taking $\eta$ small, and the second because $f_\epsilon$ is bounded and $\|X - X_{(i)}(X)\|_2 \rightarrow 0$ in probability for $i \leq k$ when $k/n \rightarrow 0$.
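As a concrete instance of the density step above (a hypothetical one-dimensional example): take $f = \mathbf{1}_{[0,1]}$, which is integrable and discontinuous, and let $f_\epsilon$ be the piecewise-linear function that ramps from $0$ to $1$ over width $\epsilon$ at each edge of $[0,1]$, so it is uniformly continuous with support $[-\epsilon, 1+\epsilon]$ and $\int |f - f_\epsilon| \, dx = \epsilon$:

```python
import numpy as np

# Hypothetical d = 1 example: f = indicator of [0, 1] (integrable,
# discontinuous); f_eps adds linear ramps of width eps at each edge,
# making it uniformly continuous with bounded support [-eps, 1 + eps].
def f(x):
    return ((0.0 <= x) & (x <= 1.0)).astype(float)

def f_eps(x, eps):
    # min of the two edge ramps, clipped to [0, 1]; equals 1 on [0, 1]
    return np.clip(np.minimum(x + eps, 1.0 + eps - x) / eps, 0.0, 1.0)

eps = 0.1
grid = np.linspace(-1.0, 2.0, 300001)
dx = grid[1] - grid[0]
l1_error = np.sum(np.abs(f(grid) - f_eps(grid, eps))) * dx  # Riemann sum
print(l1_error)  # ~ eps: two triangular ramps of area eps / 2 each
```

Under the law of $X$ (say $X$ uniform on $[-1, 2]$, another illustrative assumption) the quantity $\mathbb{E}\{|f(X) - f_\epsilon(X)|\}$ is this integral weighted by the density, so it is likewise $O(\epsilon)$.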

Now we'll show that $\mathbb{E}\{|f_\epsilon(X) - f_\epsilon(X_{(i)}(X))|\}$ is close to $\mathbb{E}\{|f(X) - f(X_{(i)}(X))|\}$:

$$|\mathbb{E}\{|f_\epsilon(X) - f_\epsilon(X_{(i)}(X))|\} - \mathbb{E}\{|f(X) - f(X_{(i)}(X))|\}| \leq \mathbb{E}\{|f_\epsilon(X) - f_\epsilon(X_{(i)}(X)) - f(X) + f(X_{(i)}(X))|\}$$

by the reverse triangle inequality (together with $|\mathbb{E}\,Y| \leq \mathbb{E}|Y|$). An application of the triangle inequality then gives:

$$\leq \mathbb{E}\{|f_\epsilon(X) - f(X)|\} + \underbrace{\mathbb{E}\{|f_\epsilon(X_{(i)}(X)) - f(X_{(i)}(X))|\}}_{(A)}$$

each of which is at most $\epsilon$ by construction.

I could use some feedback on my solution and reasoning.

EDIT: It's not clear to me that the quantity $(A)$ is at most $\epsilon$ since $X_{(i)}(X)$ does not have the same distribution as $X$ (and the condition on $f_\epsilon$ involves the distribution of $X$ not $X_{(i)}(X)$).

EDIT 2: We know that $X_{(i)}(X) \rightarrow^p X$, but without a continuity assumption on $f$, I can't say anything about the distribution of $f(X_{(i)}(X))$. In comparison, I know that $f_\epsilon(X_{(i)}(X)) = f_\epsilon(X) + o_p(1)$.

EDIT 3: Alternatively, it suffices for $f_\epsilon$ to converge to $f$ almost uniformly. Although convergence in $L^1$ does not imply pointwise convergence, we can select a subsequence of $(f_\epsilon)$ that converges to $f$ pointwise almost everywhere. By Egorov's theorem (which applies since the law of $X$ is a probability measure, hence finite), this pointwise a.e. convergence upgrades to almost uniform convergence.
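For intuition on the Egorov step, a standard toy example (used here only for illustration, not the setting above): $f_n(x) = x^n$ on $[0, 1]$ converges pointwise a.e. to $0$ but not uniformly, since $\sup_{[0,1]} x^n = 1$ for every $n$; removing an interval of measure $\delta$ restores uniform convergence, which is exactly the almost uniform convergence Egorov's theorem provides.

```python
import numpy as np

# Toy Egorov example: f_n(x) = x**n on [0, 1] converges pointwise a.e.
# to 0, but sup over [0, 1] is always 1. Deleting the interval
# (1 - delta, 1] of measure delta makes the convergence uniform:
# sup over [0, 1 - delta] of x**n is (1 - delta)**n -> 0.
delta = 0.05
xs = np.linspace(0.0, 1.0 - delta, 10001)
for n in (10, 100, 1000):
    print(n, np.max(xs ** n))  # = (1 - delta)**n, shrinking to 0
```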