Finding mutual information in discrete linear partial observation stochastic process


I have a basic question; it may not be too hard for you, but I am a bit confused. Let our system be:

\begin{align} X_{k+1} &= A_k X_k + W_k \\ Y_k &= C_k X_k + V_k \end{align}

where $A_k$ and $C_k$ are given constants (we can even take them equal for all time steps; that is not the point of my question). Let us also take $X_0$, $W_k$, and $V_k$ to be Gaussian and mutually independent. Now I want to define $I_N$ as the mutual information between $X_N$ and $Y_0, \dots, Y_{N-1}$, and find $\Delta I = I_N - I_{N-1}$. How can I compute this? Should I use the Radon–Nikodym derivative, or the conditional expectation $E(X_N \mid \mathcal{F}_{Y_{N-1}})$?
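For concreteness, here is how I currently think $I_N$ could be computed numerically in the scalar case. This is only a sketch under my own assumptions (hypothetical parameter values $A$, $C$, $Q = \operatorname{Var}(W_k)$, $R = \operatorname{Var}(V_k)$, $P_0 = \operatorname{Var}(X_0)$, all chosen just for illustration): since everything is jointly Gaussian, $I(X_N; Y_0,\dots,Y_{N-1}) = \tfrac{1}{2}\log\big(\operatorname{Var}(X_N) / \operatorname{Var}(X_N \mid Y_0,\dots,Y_{N-1})\big)$, and the conditional variance is exactly the Kalman filter's one-step predicted variance.

```python
import numpy as np

# Hypothetical scalar model parameters (my own choice, not from the problem)
A, C = 0.9, 1.0
Q, R = 1.0, 0.5   # Var(W_k), Var(V_k)
P0 = 1.0          # Var(X_0)

def mutual_info(N):
    """I_N = I(X_N; Y_0, ..., Y_{N-1}) for the scalar Gaussian model.

    Sigma tracks the unconditional variance Var(X_k); P tracks the
    Kalman predicted variance Var(X_k | Y_0, ..., Y_{k-1}).  Both follow
    Riccati-type recursions, and for jointly Gaussian variables
    I = 0.5 * log(Sigma_N / P_N).
    """
    Sigma = P0
    P = P0
    for _ in range(N):
        # Measurement update: condition P on Y_k ...
        P = P - (P * C) ** 2 / (C * C * P + R)
        # ... then time update: propagate both variances to step k+1
        P = A * A * P + Q
        Sigma = A * A * Sigma + Q
    return 0.5 * np.log(Sigma / P)

# Delta I_N = I_N - I_{N-1} is then just a difference of two such values
def delta_I(N):
    return mutual_info(N) - mutual_info(N - 1)
```

If this recursion is right, then the question reduces to whether $\Delta I$ has a closed form in terms of the Riccati iterates, which is where I suspect the conditional-expectation (filtering) viewpoint comes in.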