Let $U$ be a $U$-statistic, $\gamma=\mathbb{E}(U)$ and $U^{*}=U-\gamma$. What can we say about the convergence of $\sqrt{n}[U^{*}-\gamma]$?


$\boxed{\mbox{Context of my question}}$

We are located in the theory of $U$-statistics, in this sense, we define the following concepts:

Definition 1: A parameter $\gamma$ is said to be estimable of degree $r$ for the family of distributions $\mathscr{F}$ if $r$ is the smallest sample size for which there exists a symmetric function $h(x_{1},\ldots,x_{r})$ such that $$\mathbb{E}_{F}[h(X_{1},\ldots,X_{r})]=\gamma$$for every distribution $F(\cdot)\in \mathscr{F}$, where $X_{1},\ldots, X_{r}$ denotes a random sample from $F(\cdot)$ and $h$ is a statistic and thus does not depend on $F(\cdot)$. The function $h$ is called the symmetric kernel of the parameter $\gamma$.
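As a concrete instance (a standard example, not part of the question itself): the population variance $\sigma^{2}$ is estimable of degree $2$, since no symmetric function of a single observation is unbiased for $\sigma^{2}$ for every $F$, while $$h(x_{1},x_{2})=\frac{(x_{1}-x_{2})^{2}}{2},\qquad \mathbb{E}_{F}[h(X_{1},X_{2})]=\frac{1}{2}\mathbb{E}_{F}[(X_{1}-X_{2})^{2}]=\sigma^{2}.$$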

Now, suppose we have a random sample $X_{1},\ldots,X_{n}$, $n\geq r$, from a distribution with c.d.f. $F(\cdot)\in \mathscr{F}$. Naturally, we want to use all $n$ observations in constructing an unbiased estimator of $\gamma$.

Definition 2: A $U$-statistic for the estimable parameter $\gamma$ of degree $r$ is created with the symmetric kernel $h(\cdot)$ by forming $$U(X_{1},\ldots,X_{n})=\frac 1 {\binom n r} \sum_{\beta\in B} h(X_{\beta_1},\ldots,X_{\beta_r}) \tag 1$$ where $B=\left\{\beta\mid\beta\right.$ is one of the $\binom{n}{r}$ unordered subsets of $r$ integers chosen without replacement from the set $\left\{1,\ldots,n\right\} \left.\right\}$.
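For what it is worth, $(1)$ can be computed directly by averaging the kernel over all $\binom{n}{r}$ subsets. This is an illustrative sketch (the function name `u_statistic` is my own, not from any book), shown here with the degree-$2$ variance kernel $h(x_{1},x_{2})=(x_{1}-x_{2})^{2}/2$:

```python
from itertools import combinations
from math import comb

def u_statistic(data, kernel, r):
    """U-statistic of degree r: average of the symmetric kernel
    over all C(n, r) unordered r-subsets of the sample."""
    n = len(data)
    total = sum(kernel(*(data[i] for i in idx))
                for idx in combinations(range(n), r))
    return total / comb(n, r)

# Degree-2 kernel h(x1, x2) = (x1 - x2)^2 / 2, whose U-statistic
# is the unbiased sample variance.
h_var = lambda x1, x2: (x1 - x2) ** 2 / 2
print(u_statistic([1, 2, 3], h_var, 2))  # 1.0
```

For the sample $\{1,2,3\}$ this returns $1.0$, matching the unbiased sample variance.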

The following important theorem due to Hoeffding (1948) establishes the asymptotic normality of standardized one-sample $U$-statistics.

Theorem 1: Let $X_1,\ldots, X_n$ denote a random sample from some population. Let $\gamma$ be an estimable parameter of degree $r$ with symmetric kernel $h(x_1,\ldots,x_r)$. If $\mathbb{E}[h^2(X_1,\ldots,X_r)]<\infty$ and if $U$ is defined as $(1)$, then $$\sqrt{n}[U(X_1,\ldots,X_n)-\gamma]$$ has a limiting normal distribution with mean $0$ and variance $r^2 \xi_1$, provided $$\xi_1:=\mathbb{E}[h(X_1,X_2,\ldots,X_r)h(X_1,X_{r+1},\ldots,X_{2r-1})]-\gamma^2$$ is positive.
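Theorem 1 can be illustrated numerically with a Monte Carlo sketch (all parameter choices below are mine, purely for illustration). Take the degree-$2$ kernel $h(x_1,x_2)=(x_1+x_2)/2$: each observation appears in $n-1$ of the $\binom{n}{2}$ pairs, so $U$ reduces algebraically to the sample mean, $\gamma=\mu$, $\xi_1=\sigma^2/4$, and the limiting variance is $r^2\xi_1=\sigma^2$:

```python
import random
import statistics

# Monte Carlo sketch: kernel h(x1, x2) = (x1 + x2)/2, for which the
# U-statistic equals the sample mean. With X ~ N(mu, 1) we have
# gamma = mu, xi_1 = 1/4, and limiting variance r^2 * xi_1 = 1.
random.seed(0)
n, reps, mu = 200, 2000, 0.0
scaled = []
for _ in range(reps):
    sample = [random.gauss(mu, 1.0) for _ in range(n)]
    u = statistics.fmean(sample)          # U-statistic for this kernel
    scaled.append(n ** 0.5 * (u - mu))    # sqrt(n)[U - gamma]
# empirical variance of sqrt(n)[U - gamma]; should be near 1
print(statistics.variance(scaled))
```

The printed empirical variance lands close to the theoretical $r^2\xi_1=1$.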

In this context, we define $$U^{*}:=U-\mathbb{E}[U]=U-\gamma.$$

$\boxed{\mbox{My problem}}$

Let $X_{1},\ldots, X_{n}$ be i.i.d. random variables with c.d.f. $F(x)$. Let $p=\mathbb{P}[X_{1}>0]$ and set $\gamma=p(1-p)$. I found a $U$-statistic estimator of $\gamma$; it is determined by the symmetric kernel $$h(x_{1},x_{2}):=\frac{1}{2}[\Psi(x_{1})\Psi(-x_{2})+\Psi(x_{2})\Psi(-x_{1})]$$ where $\Psi(x)=1$ if $x>0$ and $\Psi(x)=0$ if $x\leq 0$. In this sense, we have $$U(X_{1},\ldots,X_{n})=\frac{1}{\binom{n}{2}}\sum_{\beta\in B}h(X_{\beta_1},X_{\beta_2})=\frac{1}{n(n-1)}\sum_{i=1}^n \sum_{j=i+1}^n [\Psi(X_i)\Psi(-X_j)+\Psi(X_j)\Psi(-X_i)].$$
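As a quick sanity check (illustrative code; the helper names are mine), one can verify numerically that the $\binom{n}{2}$-average of the kernel and the $\frac{1}{n(n-1)}$ double-sum form above agree:

```python
from itertools import combinations
from math import comb
import random

def psi(x):
    # Psi(x) = 1 if x > 0, else 0
    return 1 if x > 0 else 0

def h(x1, x2):
    # symmetric kernel with E[h(X1, X2)] = p(1 - p)
    return 0.5 * (psi(x1) * psi(-x2) + psi(x2) * psi(-x1))

def u_kernel_average(xs):
    # form (1): average of h over all C(n, 2) unordered pairs
    n = len(xs)
    return sum(h(xs[i], xs[j])
               for i, j in combinations(range(n), 2)) / comb(n, 2)

def u_double_sum(xs):
    # equivalent 1/(n(n-1)) double-sum form
    n = len(xs)
    s = sum(psi(xs[i]) * psi(-xs[j]) + psi(xs[j]) * psi(-xs[i])
            for i in range(n) for j in range(i + 1, n))
    return s / (n * (n - 1))

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(30)]
print(abs(u_kernel_average(xs) - u_double_sum(xs)) < 1e-12)  # True
```

The two forms coincide because the kernel's factor $\frac{1}{2}$ exactly cancels the $2$ in $\binom{n}{2}=\frac{n(n-1)}{2}$.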
After doing all the calculations, we also have $$\xi_1=\frac{p(1-p)}{4}-p^2(1-p)^2$$ and, writing $\xi_2=\mathbb{E}[h^{2}(X_{1},X_{2})]-\gamma^{2}=\frac{p(1-p)}{2}-p^{2}(1-p)^{2}$, $$\operatorname{Var}[U(X_{1},\ldots, X_{n})]=\frac{2}{n(n-1)}\bigl[2(n-2)\xi_1+\xi_2\bigr]=\frac{p(1-p)}{n}-\frac{2(2n-3)}{n(n-1)}p^{2}(1-p)^{2}.$$ I give these last two quantities because I think they may be useful in answering my question.
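For the record, $\xi_1$ can be verified through the conditional kernel $h_1(x):=\mathbb{E}[h(x,X_2)]$: since $\Psi(x)\Psi(-x)=0$ and $\Psi^{2}=\Psi$, $$h_1(x)=\frac{1}{2}\bigl[(1-p)\Psi(x)+p\,\Psi(-x)\bigr],\qquad \mathbb{E}[h_1^{2}(X_1)]=\frac{1}{4}\bigl[(1-p)^{2}p+p^{2}(1-p)\bigr]=\frac{p(1-p)}{4},$$ so that $\xi_1=\mathbb{E}[h_1^{2}(X_1)]-\gamma^{2}=\frac{p(1-p)}{4}-p^{2}(1-p)^{2}$.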

The problem I want to formulate comes from a book on non-parametric statistics; it reads as follows:

Apply Theorem 1 to $U^{*}$ and determine the form of the limiting variance of $\sqrt{n}[U^{*}-\gamma]$.

$\boxed{\mbox{Remark:}}$ I know that from Theorem 1 we can say $\sqrt{n}U^{*}$ has a limiting normal distribution with mean $0$ and variance $r^{2}\xi_{1}$, but I do not know how this helps me determine anything about the convergence of $\sqrt{n}[U^{*}-\gamma]$. I have tried to rewrite $\sqrt{n}[U^{*}-\gamma]$ and can express it as $$\sqrt{n}[U^{*}-\gamma]=\sqrt{n}[2U^{*}-U],$$ since $2U^{*}-U=2(U-\gamma)-U=U-2\gamma=U^{*}-\gamma$. I do not know whether this last expression helps at all.