Least Square Estimator Derivation for 2-Dimensional Stochastic Process


I am trying to work through an example in the paper *Least squares estimators for discretely observed stochastic processes*.

The authors give the following

$$ \Psi_{n,\epsilon}(\theta) = \sum_{k=1}^n \frac{\lvert X_{t_k} - X_{t_{k-1}}-b(X_{t_{k-1}},\theta)\Delta_{t_{k-1}}\rvert^2}{\epsilon^2\Delta_{t_{k-1}}} $$ and minimizing this over $\theta$ gives the least squares estimator.
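To make the objective concrete, here is a small numerical sketch (not from the paper; the function and variable names are mine) that evaluates $\Psi_{n,\epsilon}(\theta)$ for a generic drift $b$, and checks that data generated by a noise-free Euler scheme with the true $\theta$ gives a zero objective:

```python
import numpy as np

def lse_objective(theta, X, t, b, eps):
    """Psi_{n,eps}(theta): sum over k of the squared normalized Euler residuals
    |X_{t_k} - X_{t_{k-1}} - b(X_{t_{k-1}}, theta) * dt_{k-1}|^2 / (eps^2 * dt_{k-1})."""
    dt = np.diff(t)                        # step sizes Delta_{t_{k-1}}
    dX = np.diff(X, axis=0)                # increments X_{t_k} - X_{t_{k-1}}
    drift = np.array([b(x, theta) for x in X[:-1]])
    resid = dX - drift * dt[:, None]
    return np.sum(np.sum(resid**2, axis=1) / (eps**2 * dt))

# sanity check: a noise-free Euler path generated with theta = 0.5
t = np.linspace(0.0, 1.0, 11)
b = lambda x, th: -th * x                  # toy linear drift on R^2
X = np.zeros((11, 2))
X[0] = [1.0, 2.0]
for k in range(10):
    X[k + 1] = X[k] + b(X[k], 0.5) * (t[k + 1] - t[k])

at_true = lse_objective(0.5, X, t, b, eps=0.1)   # zero residuals
at_wrong = lse_objective(0.3, X, t, b, eps=0.1)  # nonzero residuals
```

Since the path is exactly Euler with $\theta=0.5$, `at_true` is $0$, while any other $\theta$ leaves nonzero residuals.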

The example I am working through is as follows:

Example 2.9

The authors state that after some basic calculations they obtain the LSE; however, it doesn't seem so basic to me. If I try to write out $\Psi_{n,\epsilon}(\theta)$ in this case, it becomes very messy very quickly. Also, the matrix $\Lambda_n$ is not invertible, nor is it clear to me where they got it from. So I must be missing something, as I assume the example is indeed correct.

Would appreciate if someone would enlighten me.

Best answer:

Define $B^T=[C,A]$ and $\tilde{y}=[1,y^{(1)},y^{(2)}]^T$, so that $C+Ay=B^T\tilde{y}$. Note that for $x,y \in \mathbb{R}^2$,
$$|x-(C+Ay)n^{-1}|^2=|x-B^T\tilde{y}n^{-1}|^2=x^Tx-2x^TB^T\tilde{y}n^{-1}+\tilde{y}^TBB^T\tilde{y}n^{-2},$$
and by linearity
$$\varepsilon^{-2}n\sum_{k\leq n}|x_k-(C+Ay_k)n^{-1}|^2=\varepsilon^{-2}n\sum_{k\leq n}x_k^Tx_k-2\varepsilon^{-2}\sum_{k\leq n}x_k^TB^T\tilde{y}_k+\varepsilon^{-2}n^{-1}\sum_{k\leq n}\tilde{y}^T_kBB^T\tilde{y}_k.$$
Take the derivative with respect to $B$ and set it to zero (the $\varepsilon^{-2}$ factors cancel):
$$-2\sum_{k\leq n}\tilde{y}_kx_k^T+2n^{-1}\bigg(\sum_{k\leq n}\tilde{y}_k\tilde{y}^T_k\bigg)B=0\implies B=\bigg(\sum_{k\leq n}\tilde{y}_k\tilde{y}_k^T\bigg)^{-1}\bigg(n\sum_{k\leq n}\tilde{y}_kx_k^T\bigg).$$
Now note
$$\tilde{y}_k\tilde{y}_k^T=\begin{bmatrix}1&y^{(1)}_k&y^{(2)}_k\\ y^{(1)}_k&(y_k^{(1)})^2&y_k^{(1)}y_k^{(2)}\\ y^{(2)}_k&y_k^{(1)}y_k^{(2)}&(y_k^{(2)})^2 \end{bmatrix},\qquad\tilde{y}_kx_k^T= \begin{bmatrix}x^{(1)}_k&x^{(2)}_k\\ x^{(1)}_ky^{(1)}_k&x^{(2)}_ky_k^{(1)}\\ x^{(1)}_ky^{(2)}_k&x^{(2)}_ky_k^{(2)} \end{bmatrix}.$$
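The closed form for $B$ is just the solution of the normal equations of an ordinary least squares problem in the design $\tilde{y}_k^T n^{-1}$. A quick numerical sketch (all names are mine, not from the paper) checking the formula against `np.linalg.lstsq` on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
y = rng.normal(size=(n, 2))                  # observed states y_k in R^2
Ytil = np.hstack([np.ones((n, 1)), y])       # rows are tilde{y}_k^T
B_true = rng.normal(size=(3, 2))             # B^T = [C, A], stored as 3x2
x = (Ytil @ B_true) / n + 1e-3 * rng.normal(size=(n, 2))

# closed-form minimizer: B = (sum y~_k y~_k^T)^{-1} (n sum y~_k x_k^T)
B_hat = np.linalg.solve(Ytil.T @ Ytil, n * (Ytil.T @ x))

# cross-check: ordinary least squares on (Ytil / n) B = x
B_ls, *_ = np.linalg.lstsq(Ytil / n, x, rcond=None)
```

Here `Ytil.T @ Ytil` is $\sum_k \tilde{y}_k\tilde{y}_k^T$ and `Ytil.T @ x` is $\sum_k \tilde{y}_kx_k^T$, so both computations agree up to floating point, which is one way to convince yourself the matrix-calculus step is right.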