Variance of parameter estimate using recursive least squares


I am learning about recursive least squares estimation using a forgetting factor $\lambda$ as a tool for treating time variations of model parameters and have become stuck on the following problem.

Question

Find an expression for $V\big[\hat{b}\big]$ given $$y_t = bu_t + e_t, \quad t=1,...,N$$

Where $e_t$ is white Gaussian noise with variance $\sigma^2_e$ and $u_t$ is a deterministic signal such that

$$\lim_{N\to\infty} \frac{1}{N} \sum_{t=1}^{N} u^2_t$$

is finite. The unknown parameter b is estimated as $$\hat{b}= \operatorname*{argmin}_b \sum_{t=1}^{N} \lambda^{N-t}(y_t-bu_t)^2,$$ where $0<\lambda \leq 1$.

My attempt at a solution

It can be seen that the argument that minimises the above equation is $\hat{b}= \frac{y_t}{u_t}$. However, when I try to calculate the variance I get

$$V\big[\hat{b}\big]=V\big[\frac{y_t}{u_t}\big].$$ But as $u_t$ is a deterministic signal and I am under the impression that the variance of a deterministic signal is zero this would give me a zero in the denominator?

Any help greatly appreciated.

Edit

After user617446's comment I went back and recalculated $\hat{b}$ as follows:

$$\frac{\partial}{\partial b}\bigg[ \sum_{t=1}^{N} \lambda^{N-t}(y_t-bu_t)^2 \bigg] = 2b\sum_{t=1}^{N}\lambda^{N-t}u_t^2-2\sum_{t=1}^{N}\lambda^{N-t}y_tu_t$$

setting this equal to zero then solving gave

$$\hat{b}=\frac{\sum_{t=1}^{N}\lambda^{N-t}y_tu_t}{\sum_{t=1}^{N}\lambda^{N-t}u_t^2}.$$
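As a sanity check (not part of the original problem), the closed-form weighted least-squares estimate can be computed numerically. The signal $u_t$ and all constants below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N, b, lam, sigma_e = 200, 2.5, 0.95, 0.1

t = np.arange(1, N + 1)
u = np.sin(0.3 * t)                           # deterministic input with finite average power
e = rng.normal(0.0, sigma_e, N)               # white Gaussian noise
y = b * u + e

w = lam ** (N - t)                            # forgetting-factor weights lambda^(N-t)
b_hat = np.sum(w * y * u) / np.sum(w * u**2)  # the weighted least-squares estimate above
print(b_hat)                                  # close to the true b = 2.5
```

The forgetting factor $\lambda < 1$ down-weights old samples, so only the most recent observations contribute appreciably to the estimate.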

I believe this to be correct, but I am now stuck once again on how to calculate the variance. Grateful for any and all help.

1 Answer


Firstly, the least-squares estimate can be found by differentiating the sum with respect to the parameter $b,$ so the expression $$\hat b =\dfrac{\sum\limits_{t=1}^N\lambda^{N-t}y_t u_t}{\sum\limits_{t=1}^N\lambda^{N-t}u_t^2}$$ is correct.

The estimate $\hat b$ should be considered a random variable whose value depends on the particular white-noise realization: $$\hat b =\dfrac{\sum\limits_{t=1}^N\lambda^{N-t}(e_t+u_t b) u_t}{\sum\limits_{t=1}^N\lambda^{N-t}u_t^2} =b + \dfrac{\sum\limits_{t=1}^N\lambda^{N-t}e_t u_t}{\sum\limits_{t=1}^N\lambda^{N-t}u_t^2}.$$ Since $e_t$ has zero mean, the second term has zero expectation, so the estimator is unbiased: $$M(\hat b) = b.$$ Then the variance is $$V(\hat b) = M((\hat b-b)^2) = M\left(\left(\dfrac{\sum\limits_{t=1}^N\lambda^{N-t}e_t u_t}{\sum\limits_{t=1}^N\lambda^{N-t}u_t^2}\right)^2\right)\\[4pt] = \dfrac{M\left(\sum\limits_{t=1}^N \lambda^{2(N-t)}u_t^2e_t^2\right) +2M\left(\sum\limits_{1\leq t_1 < t_2\leq N} \lambda^{2N-t_1-t_2}u_{t_1}u_{t_2}e_{t_1} e_{t_2}\right)}{\left(\sum\limits_{t=1}^N\lambda^{N-t}u_t^2\right)^2} = \color{brown}{\mathbf{\dfrac{\sum\limits_{t=1}^N \lambda^{2(N-t)}u_t^2}{\left(\sum\limits_{t=1}^N\lambda^{N-t}u_t^2\right)^2}\cdot\sigma_e^2}},$$ where the cross terms vanish because $e_{t_1}$ and $e_{t_2}$ are independent with zero mean.
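The closed-form variance can be checked against a Monte Carlo estimate. This is only a numerical sketch; the input signal and constants below are illustrative assumptions, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(1)
N, b, lam, sigma_e = 100, 1.0, 0.9, 0.5
t = np.arange(1, N + 1)
u = np.cos(0.2 * t) + 1.5                     # deterministic, bounded input
w = lam ** (N - t)                            # weights lambda^(N-t)

# Theoretical variance from the derivation above
var_theory = sigma_e**2 * np.sum(w**2 * u**2) / np.sum(w * u**2) ** 2

# Monte Carlo: re-estimate b over many independent noise realizations
M = 20000
e = rng.normal(0.0, sigma_e, (M, N))          # M independent noise sequences
y = b * u + e                                 # broadcasts u over all realizations
b_hat = (y * u * w).sum(axis=1) / np.sum(w * u**2)

print(var_theory, b_hat.var())                # the two should agree closely
```

The empirical variance of the $M$ estimates matches the boxed formula to within Monte Carlo error, and the sample mean of `b_hat` confirms the unbiasedness argument.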