On Kalman Filtering (Brownian Motion, Martingales and Stochastic Calculus by Le-Gall)


I'm trying to solve Exercise 1.16 in Le Gall's book Brownian Motion, Martingales and Stochastic Calculus.

Let $(\epsilon_n)_{n \geq 0}$ and $(\eta_n)_{n \geq 0}$ be two independent sequences of independent Gaussian random variables such that, for every $n$, $\epsilon_n$ is distributed according to $\mathcal{N}(0,\sigma^2)$ and $\eta_n$ is distributed according to $\mathcal{N}(0,\delta^2)$. Consider two sequences $(X_n)_{n \geq 0}$ and $(Y_n)_{n \geq 0}$ such that $X_0 = 0$, $X_{n+1} = a_n X_n + \epsilon_{n+1}$ and $Y_n = c X_n + \eta_n$, where $c$ and the $a_n$ are positive constants. We set

$$\hat{X}_{n/n} = E(X_n \mid Y_0, Y_1, \ldots, Y_n) $$ $$\hat{X}_{n+1/n} = E(X_{n+1} \mid Y_0, Y_1, \ldots, Y_n) $$ The goal is to find a recursive formula for computing these two conditional expectations.

  1. Verify that $\hat{X}_{n+1/n} = a_n \hat{X}_{n/n}$ for every $n\geq0$.
  2. Show that for every $n\geq1$, $$\hat{X}_{n/n} = \hat{X}_{n/n-1} + \frac{E(X_n Z_n)}{E(Z_n^2)} Z_n$$ where $Z_n := Y_n - c \hat{X}_{n/n-1}$.
Part 1 is easy; it follows mainly from the linearity of conditional expectation. I'm struggling with part 2, which I think has to do with orthogonal projections in $L^2$.
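As a sanity check (not part of the exercise), here is a short NumPy sketch that simulates the model with arbitrary parameter values and compares the recursion from parts 1 and 2, with gain $K_n = E(X_n Z_n)/E(Z_n^2) = c P_{n/n-1}/(c^2 P_{n/n-1} + \delta^2)$ where $P_{n/n-1} = E[(X_n - \hat{X}_{n/n-1})^2]$, against the exact conditional expectation computed by Gaussian linear regression on the joint covariance. All parameter values and variable names below are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5                                    # observe Y_0, ..., Y_N
sigma, delta, c = 0.7, 0.4, 1.3          # arbitrary values for the check
a = rng.uniform(0.5, 1.5, size=N)        # a_0, ..., a_{N-1}

# Simulate one path: X_0 = 0, X_{n+1} = a_n X_n + eps_{n+1}, Y_n = c X_n + eta_n.
eps = sigma * rng.standard_normal(N)     # eps[n] stands for eps_{n+1}
eta = delta * rng.standard_normal(N + 1)
X = np.zeros(N + 1)
for n in range(N):
    X[n + 1] = a[n] * X[n] + eps[n]
Y = c * X + eta

# Exact conditional expectation: the vector (X_n, Y_0, ..., Y_n) is centered
# Gaussian, so E(X_n | Y_0, ..., Y_n) = Cov(X_n, Y) Cov(Y)^{-1} Y.
# Writing X = M eps with unit-variance eps gives Cov(X) = sigma^2 M M^T.
M = np.zeros((N + 1, N))                 # M[n, k] = coefficient of eps_{k+1} in X_n
for n in range(N):
    M[n + 1] = a[n] * M[n]
    M[n + 1, n] = 1.0
CovX = sigma**2 * M @ M.T
CovY = c**2 * CovX + delta**2 * np.eye(N + 1)
CovXY = c * CovX                         # CovXY[n, m] = E(X_n Y_m)
exact = np.array([CovXY[n, :n + 1] @ np.linalg.solve(CovY[:n + 1, :n + 1], Y[:n + 1])
                  for n in range(N + 1)])

# Recursive computation of X_hat_{n/n} using the formulas from parts 1 and 2.
xhat, P = 0.0, 0.0                       # X_0 = 0 is known, so no initial error
filt = np.zeros(N + 1)
for n in range(N + 1):
    K = c * P / (c**2 * P + delta**2)    # gain E(X_n Z_n) / E(Z_n^2)
    xhat = xhat + K * (Y[n] - c * xhat)  # part 2: update with innovation Z_n
    P = (1 - K * c) * P                  # error variance E[(X_n - X_hat_{n/n})^2]
    filt[n] = xhat
    if n < N:
        xhat = a[n] * xhat               # part 1: X_hat_{n+1/n} = a_n X_hat_{n/n}
        P = a[n]**2 * P + sigma**2       # predicted error variance

print("recursion matches projection:", np.allclose(filt, exact))
```

The agreement reflects exactly the $L^2$ picture: in the Gaussian case, conditioning is orthogonal projection onto the linear span of the observations, which is what the regression formula computes directly.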