Least squares estimator in a stochastic process $\{Y_t\}$


Let $\{Y_t\}$ be a stochastic process such that $$\begin{cases}Y_t=\beta x_t+z_t\\z_t=\epsilon_t+\theta \epsilon_{t-1}\\\epsilon_t\sim WN(0,1)\end{cases}$$

where $WN$ denotes white noise (it is not a probability distribution), with $E[\epsilon_t]=0$ and $Var(\epsilon_t)=1$. The $x_t$ values are fixed constants, not random. Find the least squares estimator of $\beta$ and the variance of the estimator.

What I did is $$Q=\sum (y_t-(\beta x_t+z_t))^2=\sum y_t^2-2y_t(\beta x_t+z_t)+(\beta x_t+z_t)^2$$ $$=\sum y_t^2-2y_t\beta x_t-2y_tz_t+\beta^2x_t^2+2\beta x_tz_t+z_t^2$$ $$\frac{\partial Q}{\partial \beta}=-2\sum y_tx_t+2\beta\sum x_t^2+2\sum x_tz_t=0$$ $$\Leftrightarrow \hat{\beta}=\frac{\sum y_tx_t-\sum x_tz_t}{\sum x_t^2}$$
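The algebra above can be checked numerically. Note that because the numerator subtracts $\sum x_tz_t$, it collapses to $\sum x_ty_t-\sum x_tz_t=\beta\sum x_t^2$, so on simulated data the formula recovers $\beta$ exactly (a minimal sketch; the parameter values below are my own illustration, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, theta = 200, 2.0, 0.5        # illustrative values
x = np.linspace(1.0, 3.0, n)          # fixed (non-random) regressors
eps = rng.normal(0.0, 1.0, n + 1)     # white noise, variance 1
z = eps[1:] + theta * eps[:-1]        # MA(1) errors: z_t = eps_t + theta*eps_{t-1}
y = beta * x + z

# estimator derived above: (sum y_t x_t - sum x_t z_t) / sum x_t^2
beta_hat = (np.sum(y * x) - np.sum(x * z)) / np.sum(x**2)
print(beta_hat)  # recovers beta up to floating-point error
```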

Taking the second derivative $$\frac{\partial^2Q}{\partial\beta^2}=2\sum x_t^2>0$$

so $\hat{\beta}$ is a minimizer and is the least squares estimator.

$$Var(\hat{\beta})=Var\big(\frac{\sum y_tx_t-\sum x_tz_t}{\sum x_t^2}\big)=\frac{1}{(\sum x_t^2)^2}Var(\sum y_tx_t-\sum x_tz_t)$$

$$=\frac{1}{(\sum x_t^2)^2}\big(Var(\sum y_tx_t)+Var(\sum x_tz_t)-2cov(\sum y_tx_t,\sum x_tz_t)\big)$$

$$=\frac{1}{(\sum x_t^2)^2}\Big(Var\Big[\sum \beta x_t(\epsilon_t+\theta\epsilon_{t-1})\Big]+Var\Big[\sum x_t(\epsilon_t+\theta\epsilon_{t-1})\Big]-2\,cov\Big[\sum \beta x_t(\epsilon_t+\theta\epsilon_{t-1}),\sum x_t(\epsilon_t+\theta\epsilon_{t-1})\Big]\Big)$$
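One building block here can be evaluated in closed form: for the MA(1) errors, $Var(z_t)=1+\theta^2$ and $Cov(z_t,z_{t-1})=\theta$, so $Var\big(\sum x_tz_t\big)=(1+\theta^2)\sum x_t^2+2\theta\sum x_tx_{t+1}$. A quick Monte Carlo check of that identity (the $x_t$ and $\theta$ values below are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5
x = np.array([1.0, 2.0, 1.5, 0.5, 2.5])  # illustrative constants
n = len(x)

# closed form: Var(sum x_t z_t) = (1+theta^2)*sum x_t^2 + 2*theta*sum x_t x_{t+1}
var_closed = (1 + theta**2) * np.sum(x**2) + 2 * theta * np.sum(x[:-1] * x[1:])

# Monte Carlo: simulate many MA(1) paths and take the sample variance
reps = 200_000
eps = rng.normal(size=(reps, n + 1))
z = eps[:, 1:] + theta * eps[:, :-1]     # z_t = eps_t + theta*eps_{t-1}
var_mc = np.var(z @ x)                   # variance of sum_t x_t z_t across reps
print(var_closed, var_mc)
```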

Is there any mistake in the estimator?