I've been working my way through *A step-by-step tutorial on active inference and its application to empirical data* by R. Smith, K. J. Friston, and C. J. Whyte, and got bogged down on Figure 4, specifically the update equation for dynamic perception: $$s_{\tau=1}=\sigma\Big(\tfrac{1}{2}\big(\ln D + \ln B^T s_{\tau+1}\big) + \ln A^T o_\tau\Big)$$
Most of the equation makes sense to me, with the exception of the $\frac{1}{2}$. I suspect that $B^T s_{\tau+1}$ functions as an "alternative prior" alongside $D$, and that the $\frac{1}{2}$ is there to weight the two equally. Does this make sense? Is there a better way to explain it?
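To make my reading of the equation concrete, here is a minimal numerical sketch (toy numbers of my own, not from the paper): the two "temporal" messages, the prior $D$ and the backward prediction $B^T s_{\tau+1}$, are averaged in log space (which is where the $\frac{1}{2}$ comes in) before the likelihood message $\ln A^T o_\tau$ is added and the result is passed through a softmax.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

# Toy 2-state, 2-outcome generative model (illustrative values only).
D = np.array([0.7, 0.3])             # prior over initial states
A = np.array([[0.9, 0.1],            # likelihood P(o | s), columns index states
              [0.1, 0.9]])
B = np.array([[0.8, 0.2],            # transitions P(s_{t+1} | s_t)
              [0.2, 0.8]])

o_1 = np.array([1.0, 0.0])           # observed outcome at tau = 1 (one-hot)
s_2 = np.array([0.5, 0.5])           # current belief over states at tau = 2

# Average the prior message ln D and the backward message ln(B^T s_2),
# then add the likelihood message ln(A^T o_1) and normalize via softmax.
s_1 = softmax(0.5 * (np.log(D) + np.log(B.T @ s_2)) + np.log(A.T @ o_1))

print(s_1)  # belief over states at tau = 1; sums to 1
```

With these numbers the observation strongly favors the first state, so `s_1` puts most of its mass there; dropping the `0.5` would double-count the temporal evidence relative to the likelihood.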
First, I found a more recent version of the paper. I also found this sentence on page 41 of the revised version:
In short, it is a tweak. I have some more work to do, but at least I know where to look, so I'll close the question. If anyone else runs into the same problem, let me know and I'll add whatever I've learned in the meantime.