Linear Regression with lagged dependent variable question


This is my first time posting here, so I hope I get everything right. I'm taking a basic course on linear regression, but I don't really understand a problem I've come across (in fact, I don't understand a lot of the problems I've come across). The thing is, I came into this without much of a math background and I'm finding it really tough.

We have a model $y_t = ay_{t-1} + u_t$ where $|a|<1$ and $y_0 = 0$ (I believe this is an AR(1) process; a random walk would be the special case $a = 1$?). We also have the following assumptions: $E[u_t \mid y_{t-1}] = E[u_t] = 0$, $\operatorname{var}[u_t] = \sigma^2$, and $E[u_t u_s] = 0$ for $t \neq s$.

The question is about estimating $a$ using OLS. The first part asks me to write $y_t$ in terms of $u_{t-1}, u_{t-2}, ...$ which I find to be:

$y_t = \sum_{i=0}^{t-1} a^iu_{t-i}$
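For what it's worth, these are the substitution steps I used (just repeatedly applying the model equation and then using $y_0 = 0$):

$$
\begin{aligned}
y_t &= a y_{t-1} + u_t \\
    &= a\,(a y_{t-2} + u_{t-1}) + u_t = a^2 y_{t-2} + a u_{t-1} + u_t \\
    &\;\;\vdots \\
    &= a^t y_0 + \sum_{i=0}^{t-1} a^i u_{t-i} = \sum_{i=0}^{t-1} a^i u_{t-i}.
\end{aligned}
$$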

This follows by backward substitution. It then asks me to "hence show" that it may be the case that:

$$E\!\left[\frac{\sum_{t=1}^{T} y_{t-1}u_t}{\sum_{t=1}^{T} y_{t-1}^2}\right] \neq 0$$

But I do not really know how to proceed. Looking at $E[\sum_{t=1}^{T} y_{t-1}u_t]$, I realise this is not like the usual case, because strict exogeneity fails: $u_t$ is correlated with *future* values of the regressor, i.e. $E[u_t \mid y_{t+1}] \neq 0$ even though $E[u_t \mid y_{t-1}] = 0$. So it looks as if estimation by OLS will be biased.
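To convince myself the bias is real, I also tried a quick Monte Carlo simulation (my own sketch in Python/NumPy, with $a = 0.5$, $\sigma^2 = 1$, and $T = 25$ chosen arbitrarily just to see what happens), and the average OLS estimate does come out below the true $a$:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.5          # true coefficient (|a| < 1), chosen arbitrarily
T = 25           # short sample, where the bias should be most visible
reps = 20000     # Monte Carlo replications

estimates = np.empty(reps)
for r in range(reps):
    u = rng.standard_normal(T)       # u_t iid N(0, 1), so sigma^2 = 1
    y = np.zeros(T + 1)              # y_0 = 0
    for t in range(1, T + 1):
        y[t] = a * y[t - 1] + u[t - 1]
    # OLS of y_t on y_{t-1}:  a_hat = sum(y_{t-1} y_t) / sum(y_{t-1}^2)
    estimates[r] = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

print(np.mean(estimates))  # averages noticeably below the true a = 0.5
```

So empirically $E[\hat a] < a$ in small samples, which seems to match what the question wants me to show.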

But I still don't really understand what to do: I know I need to use $y_t = \sum_{i=0}^{t-1} a^i u_{t-i}$, but I just get in a mess. If anyone could post any pointers, that would be great; this looks like an awesome community. I hope the post is formatted and explained OK.

Thanks.