Show whether or not $\hat\beta$ is a consistent estimator


I have the following model:

$y_i=\mathbf x_i'\beta+\epsilon_i $

$E(\mathbf x_i\epsilon_i)=0$

Now let $g$ be a positive function and set $g_i=g(\mathbf x_i)$. Consider the estimator:

$$\hat\beta=\Big(\sum_{i=1}^n g_i\mathbf x_i\mathbf x_i'\Big)^{-1}\Big(\sum_{i=1}^n g_i\mathbf x_iy_i\Big)$$

I want to find the probability limit of this estimator $\hat\beta$ as $n\to\infty$. Also, is $\hat\beta$ consistent for $\beta$? If not, under what assumption is $\hat\beta$ consistent for $\beta$? Any comment would be helpful!


There are 2 solutions below.


$$\hat{\beta}=\Big(\frac1n\sum_i g_i\mathbf x_i\mathbf x_i'\Big)^{-1}\Big(\frac1n\sum_i g_i\mathbf x_iy_i\Big)\xrightarrow{p} \big(E[g(X)XX']\big)^{-1}E[g(X)XY]\quad\text{(by the WLLN and the continuous mapping theorem)}$$

and, substituting $Y=X'\beta+\epsilon$,

$$\big(E[g(X)XX']\big)^{-1}E[g(X)XY]=\big(E[g(X)XX']\big)^{-1}E[g(X)XX']\beta+\big(E[g(X)XX']\big)^{-1}E[g(X)X\epsilon]=\beta+\big(E[g(X)XX']\big)^{-1}E[g(X)X\epsilon].$$

So, provided $E(g(X)XX')$ is invertible, $\hat{\beta}$ is consistent for $\beta$ exactly when $E(g(X)X\epsilon)=0$. Note that this condition does not follow from the stated assumption $E(X\epsilon)=0$ alone.
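A quick Monte Carlo illustrates that $E(\mathbf x_i\epsilon_i)=0$ by itself is not enough. The following sketch uses a scalar example of my own choosing: with $x\sim N(0,1)$ and $\epsilon=x^3-3x$ we get $E(x\epsilon)=E(x^4)-3E(x^2)=0$, yet for the weight $g(x)=1+x^2$ we have $E[g(x)x\epsilon]=E[(1+x^2)(x^4-3x^2)]=6\neq 0$ and $E[g(x)x^2]=4$, so the weighted estimator should converge to $\beta+6/4$ rather than $\beta$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 500_000, 2.0

x = rng.standard_normal(n)
eps = x**3 - 3*x          # E[x*eps] = E[x^4] - 3E[x^2] = 0, so OLS is consistent
y = beta * x + eps
g = 1 + x**2              # positive weight, but E[g(x)*x*eps] = 6 != 0

b_ols = (x * y).sum() / (x * x).sum()          # unweighted (g = 1)
b_w = (g * x * y).sum() / (g * x * x).sum()    # weighted estimator

# plim of b_w: beta + E[g x eps]/E[g x^2] = 2 + 6/4 = 3.5
print(b_ols, b_w)
```

Running this, `b_ols` lands near $\beta=2$ while `b_w` lands near $3.5$, matching the probability limit above.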


\begin{align} \hat{\beta}_n & = \Big(\sum_{i=1}^n g_i\mathbf x_i\mathbf x_i'\Big)^{-1}\sum_{i=1}^n g_i\mathbf x_iy_i\\ & = \Big(\sum_{i=1}^n g_i\mathbf x_i\mathbf x_i'\Big)^{-1}\sum_{i=1}^n g_i\mathbf x_i(\mathbf x_i'\beta + \epsilon_i)\\ &= \beta + \Big(\frac1n\sum_{i=1}^n g_i\mathbf x_i\mathbf x_i'\Big)^{-1}\frac1n\sum_{i=1}^n g_i\mathbf x_i\epsilon_i\\ &=\beta+o_p(1), \end{align} where the last step follows from the WLLN applied to $$ \frac1n\sum_{i=1}^n g_i\mathbf x_i\epsilon_i \xrightarrow{p} E(g_i\mathbf x_i\epsilon_i)=0, $$ which requires $E(g_i\mathbf x_i\epsilon_i)=0$. Note that this is stronger than the stated assumption $E(\mathbf x_i\epsilon_i)=0$; it holds, for instance, whenever $E(\epsilon_i\mid\mathbf x_i)=0$. We also use $$ \Big(\frac1n\sum_{i=1}^n g_i\mathbf x_i\mathbf x_i'\Big)^{-1} \xrightarrow{p}\big(\mathbb{E}\,g_i\mathbf x_i\mathbf x_i'\big)^{-1} $$ by the WLLN and the continuous mapping theorem, then combine the two limits via Slutsky's theorem.
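As a numerical sanity check on this argument (a sketch with illustrative choices of distribution and weight), if $\epsilon_i$ is drawn independently of $\mathbf x_i$, then $E(g_i\mathbf x_i\epsilon_i)=0$ for any weight $g$, and the estimation error should shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)
beta = 2.0
g = lambda x: 1 + x**2              # any positive weight works in this setting

errors = []
for n in (1_000, 10_000, 100_000):
    x = rng.standard_normal(n)
    eps = rng.standard_normal(n)    # independent of x, so E[g(x)*x*eps] = 0
    y = beta * x + eps
    b_hat = (g(x) * x * y).sum() / (g(x) * x * x).sum()
    errors.append(abs(b_hat - beta))

print(errors)   # shrinks roughly like 1/sqrt(n)
```

The errors decay on the order of $1/\sqrt{n}$, consistent with $\hat\beta_n=\beta+o_p(1)$.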