Autocorrelation problem, regression analysis


I'm a bit stuck on an old exam question from my econometrics course (I'm not strong on mathematical statistics). Here is the problem:

Given some model $y_{it}=\beta_0+\beta_1x_{it}+u_{it}$, suppose the idiosyncratic errors are serially uncorrelated with constant variance, i.e. $var(u)=\sigma^2$, $E(u)=0$, and $corr(u_{it},u_{is})=0\: \forall \: t\neq s$. Show that $corr(u_{it}-u_{it-1},u_{it-1}-u_{it-2})=-0.5$.

I've been trying to work with the expected-value formulas for variance/covariance, but I keep hitting a dead end. Also note that since this is taught by the econ department, my matrix algebra in this area is poor, so it would be most helpful if the answer were dissected in "simple terms".

Any help appreciated!

Thanks

/I


BEST ANSWER

Expand the covariance term by term:

$$ Cov(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})=Cov(u_{it}, u_{it-1})-Cov(u_{it}, u_{it-2})-Cov(u_{it-1}, u_{it-1})+Cov(u_{it-1}, u_{it-2}) $$ The first, second, and fourth terms are zero by the no-serial-correlation assumption. The third term is $$ -Cov(u_{it-1}, u_{it-1})=-V(u_{it-1})=-\sigma^2. $$

Since the errors are uncorrelated, $V(u_{it}-u_{it-1})=V(u_{it})+V(u_{it-1})=2\sigma^2$, and likewise for the other difference. So then $$ Corr(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})=\frac{Cov(u_{it}-u_{it-1}, u_{it-1}-u_{it-2})}{\sqrt{V(u_{it}-u_{it-1})V(u_{it-1}-u_{it-2})}}=-\frac{\sigma^2}{\sqrt{4(\sigma^2)^2}}\\ =-\frac{1}{2}. $$
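The result can be checked numerically: simulate a long series of serially uncorrelated errors and compute the sample correlation between consecutive first differences. This is a quick sketch (variable names and the choice of $\sigma = 2$ are my own, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
# Serially uncorrelated errors with mean 0 and variance sigma^2
u = rng.normal(0.0, sigma, size=1_000_000)

# First differences: du[t] corresponds to u[t+1] - u[t]
du = np.diff(u)

# Sample correlation between consecutive differences
r = np.corrcoef(du[1:], du[:-1])[0, 1]
print(round(r, 3))  # close to -0.5
```

With a million draws the sample correlation lands very near the theoretical $-0.5$, regardless of the value chosen for $\sigma$.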


Ah, got it now, silly me: $corr(u_t, u_{s})=0$ implies $cov(u_t, u_{s})=0$, which (since $E(u)=0$) implies $E(u_t u_s)=0$.

Since $sd(\Delta u_t)=sd(\Delta u_{t-1})$, the denominator is just $var(\Delta u_t)$:

$corr(\Delta u_t, \Delta u_{t-1})=\frac{cov(\Delta u_t, \Delta u_{t-1})}{sd(\Delta u_t)\,sd(\Delta u_{t-1})}=\frac{E[( u_t- u_{t-1})( u_{t-1}- u_{t-2})]}{var(\Delta u_t)}$

Multiplying out the expected value gives a bunch of $E(u_t u_s)=0$ terms plus $-\sigma^2_u$, so we get $corr(\Delta u_t, \Delta u_{t-1})=\frac{-\sigma^2_u}{var(\Delta u_t)}=\frac{-\sigma^2_u}{2\sigma^2_u}=-0.5$, as desired.
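The bookkeeping in that expansion can be checked symbolically. A small sketch with SymPy (the symbol names are my own placeholders for $u_t$, $u_{t-1}$, $u_{t-2}$): expanding the product shows that every term is a cross-product $u_s u_t$ with $s \neq t$ except one, whose expectation is $-\sigma^2_u$.

```python
import sympy as sp

# Placeholder symbols for u_t, u_{t-1}, u_{t-2}
ut, utm1, utm2 = sp.symbols('u_t u_tm1 u_tm2')

product = sp.expand((ut - utm1) * (utm1 - utm2))
print(product)
# The cross terms (u_t*u_tm1, u_t*u_tm2, u_tm1*u_tm2) all have expectation 0
# by serial uncorrelatedness; only -u_tm1**2 survives, with
# E[-u_tm1**2] = -sigma_u**2.
assert product == ut*utm1 - ut*utm2 - utm1**2 + utm1*utm2
```

This is just the algebra step; the probabilistic step is then reading off which terms have zero expectation.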