Suppose I have the following Bayesian Network:
It's given by the following relations (the second parameter of each $\mathcal N$ is the precision, not the variance — this is the only reading consistent with the error function below; in particular $\text I_i = 0$ gives $Z_i$ infinite variance, i.e. an unobserved node): $$\begin{aligned}X_1&\sim \mathcal N(\mu, 1/\sigma^2)\\ \forall k, 2\leq k\leq n: X_k|X_{k-1}&\sim \mathcal N(x_{k-1}, 1/\lambda^2)\\ \forall i, 1\leq i\leq n: Z_i|X_i&\sim\mathcal N(x_{i}, \text I_i/\delta^2) \end{aligned}$$
It is possible to find the expected value of $\boldsymbol X|\boldsymbol Z$ by minimising the following error function:
$$\sum_{i=1}^n\text I_i(x_i-z_i)^2 + \frac{\delta^2}{\lambda^2}\sum_{i=2}^n(x_i-x_{i-1})^2 + \frac{\delta^2}{\sigma^2}(x_1-\mu)^2$$
(You can find the proof here.)
Setting the partial derivatives to zero, any minimiser must satisfy these equalities:
$$\begin{aligned} \hat x_1 &= \frac{\frac{\delta^2}{\lambda^2}\hat x_2 + \text I_1z_1 + \frac{\delta^2}{\sigma^2}\mu}{\frac{\delta^2}{\lambda^2} + \text I_1 + \frac{\delta^2}{\sigma^2}} \\ \hat x_n &= \frac{\text I_nz_n + \frac{\delta^2}{\lambda^2}\hat x_{n-1}}{\text I_n + \frac{\delta^2}{\lambda^2}} \\ \forall i \in \{2, 3, ..., n-1\}: \hat x_i &= \frac{\frac{\delta^2}{\lambda^2}(\hat x_{i + 1} + \hat x_{i - 1}) + \text I_iz_i}{2\frac{\delta^2}{\lambda^2} + \text I_i } \end{aligned}$$
It's possible to find $\hat x_n$ by substituting the expression for $\hat x_1$ into the equation for $\hat x_2$ (so that $\hat x_2$ depends only on $\hat x_3$), then substituting $\hat x_2$ into the equation for $\hat x_3$, and so on, until we reach $\hat x_n$, at which point it's numbers all the way down and we get an actual value.
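For concreteness, here is a sketch of that substitution scheme (it's the forward sweep of the Thomas algorithm for tridiagonal systems, followed by back-substitution). I've written it in Python rather than R, and the function and parameter names (`smooth_chain`, `sigma2`, `lam2`, `delta2`) are mine — porting it to R should be mechanical:

```python
import numpy as np

def smooth_chain(z, I, mu, sigma2, lam2, delta2):
    """Solve the stationarity equations of the chain by forward
    substitution, writing each x_i as p[i] + q[i] * x_{i+1}.
    z: observations, I: 0/1 observation indicators,
    sigma2/lam2/delta2: sigma^2, lambda^2, delta^2.
    Returns the whole vector of hat-x values (last entry is hat-x_n)."""
    n = len(z)
    a = delta2 / lam2          # coupling weight delta^2 / lambda^2
    b = delta2 / sigma2        # prior weight   delta^2 / sigma^2
    p = np.empty(n - 1)
    q = np.empty(n - 1)
    # first equation: (a + I_1 + b) x_1 = a x_2 + I_1 z_1 + b mu
    d = a + I[0] + b
    p[0] = (I[0] * z[0] + b * mu) / d
    q[0] = a / d
    # interior equations: (2a + I_i) x_i = a (x_{i+1} + x_{i-1}) + I_i z_i,
    # with x_{i-1} already expressed as p[i-1] + q[i-1] * x_i
    for i in range(1, n - 1):
        d = 2 * a + I[i] - a * q[i - 1]
        p[i] = (a * p[i - 1] + I[i] * z[i]) / d
        q[i] = a / d
    # last equation: (I_n + a) x_n = I_n z_n + a x_{n-1};
    # substituting x_{n-1} = p[-1] + q[-1] * x_n makes it purely numeric
    x = np.empty(n)
    x[-1] = (I[-1] * z[-1] + a * p[-1]) / (I[-1] + a - a * q[-1])
    for i in range(n - 2, -1, -1):   # back-substitute for the remaining x_i
        x[i] = p[i] + q[i] * x[i + 1]
    return x
```

This runs in $O(n)$ time and memory, as opposed to the $O(n^3)$ of a dense solve, and if only $\hat x_n$ is needed the final back-substitution loop can be dropped.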
Is there an easier way to solve this, though? Or an existing implementation of this algorithm in R or something? I've been trying to come up with an R implementation that doesn't consist of numerically solving the full optimisation problem and then taking the last component of the resulting vector, since that takes longer than I'd like; given that there's an "analytical" solution to the problem, I'd hope to be able to implement it directly.
