Understanding a step in a proof involving an empirical process (machine learning)


I am unable to understand the apparently simple step that yields the display appearing just before Lemma B.1 on page 12 of this paper by Chernozhukov et al. (2016). In particular, I fail to see how (2.1) implies the equation in the display.

You do not need to know anything besides:

  1. the notation, which is summarized on top of the same page,
  2. the fact that $\psi(w;\theta_{0},\eta)=0$ if and only if $\eta=\eta_{0}$,
  3. the fact that $I_k ^c$ is conditioned upon in the computation, so that $\hat{\eta}_0 (I_k ^c)$ is not random (i.e. the integration is only over the $w$ component).
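To make point 3 explicit: conditionally on the auxiliary sample $I_k^c$, the estimated nuisance $\hat{\eta}_0(I_k^c)$ is a fixed (non-random) function, so the conditional expectation reduces to an ordinary integral over $w$ alone. Writing $P$ for the marginal law of $W$ (my notation, not necessarily the paper's), this reads

$$\mathrm{E}\!\left[\psi\bigl(W;\theta_{0},\hat{\eta}_{0}(I_k^c)\bigr)\,\middle|\, I_k^c\right]=\int \psi\bigl(w;\theta_{0},\hat{\eta}_{0}(I_k^c)\bigr)\,dP(w),$$

where $\hat{\eta}_{0}(I_k^c)$ is treated as a constant under the integral sign.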