Using Bayes' method to solve an ODE system with random noise


Consider the ODE system $\frac{du}{dt} = \beta u$, $t>0$, $u(0)=1$, where $\beta$ is unknown. However, the solution at $t=1$ is observed up to some noise: $h := u(1) + \zeta$, where $\zeta$ is a random variable with distribution $N(0,1)$.

Assume we know a priori that $\beta$ follows a Gaussian distribution $N(2,1)$.

To estimate $\beta$ from the observed $h$, I follow the Bayesian method.

But here is where I run into trouble. To find the posterior, I need the likelihood, which should be found from the ODE and the distribution of $\zeta$. But I just cannot figure out their relationship here.

So far, I understand that $f(\beta) = \frac{1}{\sqrt{2\pi}}e^{-\frac{(\beta - 2)^2}{2}}$

And I got a hint which I do not understand: $f(h|\beta) = \frac{1}{\sqrt{2\pi}}e^{-\frac{(h - e^\beta)^2}{2}}$

Could anyone please tell me what this hint represents? And how can I take the next step to find the posterior?

Thank you so so so much.

Best Answer

What you're trying to do is called Maximum A Posteriori Estimation.
Namely, given a prior distribution $f$ over the parameter of interest $\beta$ and a likelihood function $f(h|\beta)$ for the quantity $h$ given $\beta$, we want to compute the most likely value of $\beta$ given our observation $h$. In other words, we're looking for $$\hat \beta = \underset{\beta}{\arg \max} \ f(\beta|h) $$

So, to be clear (using your notation):

  • $f(\beta) = \frac{1}{\sqrt{2\pi}}e^{-\frac{(\beta - 2)^2}{2}}$ is the prior distribution over $\beta$, i.e. we make the arbitrary assumption that it is normally distributed with mean $2$ and variance $1$.
  • $f(h|\beta) = \frac{1}{\sqrt{2\pi}}e^{-\frac{(h - e^\beta)^2}{2}}$ is the likelihood function. It represents (loosely speaking) the probability of observing that value of $h$ when you know the value of $\beta$. Here for instance, $u(1) = e^\beta$, so $h = u(1) + \zeta$ is a standard normal random variable "shifted" by $e^\beta$. In other words, given the value of $\beta$, $h$ is a normal random variable with mean $e^\beta$ and variance $1$, which explains the formula you're given for its distribution.
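
As a quick sanity check of this observation model (a sketch, not part of the original problem; the "true" $\beta = 2$ and the sample size are arbitrary choices for the simulation), one can draw many realizations of $h = e^\beta + \zeta$ and verify that their empirical mean is close to $e^\beta$:

```python
import math
import random

random.seed(0)

beta = 2.0           # a fixed "true" parameter value, chosen for this simulation
u1 = math.exp(beta)  # u(1) = e^beta, the exact ODE solution at t = 1

# h = u(1) + zeta with zeta ~ N(0, 1), so h ~ N(e^beta, 1)
samples = [u1 + random.gauss(0.0, 1.0) for _ in range(100_000)]

mean_h = sum(samples) / len(samples)
print(mean_h, u1)  # the empirical mean should be close to e^2 ≈ 7.389
```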
  • Finally, $f(\beta |h)$ is the posterior distribution over $\beta$ given $h$. It represents (loosely speaking once again) the probability of the value of $\beta$ given the value of $h$. We can compute it using Bayes' theorem: $$f(\beta |h) = \frac{f(h|\beta)f(\beta)}{f(h)} \propto f(h|\beta)f(\beta) $$ The $f(h)$ term in the denominator represents the probability distribution of $h$, and we don't know how to compute it. However, it doesn't matter too much since it is independent of $\beta$, so its presence or absence won't influence the $\arg\max$.

So we're looking for the value $\hat\beta$ that maximizes the posterior distribution written above, and that is the value that we'll take as our estimate of the true $\beta$. You now have all the elements you need to compute it using standard analysis tools: $$ \begin{align}\hat \beta &= \underset{\beta}{\arg \max} \ f(\beta|h) \\ &= \underset{\beta}{\arg \max} \ f(h|\beta)f(\beta) \\ &= \underset{\beta}{\arg \max} \ \frac{1}{\sqrt{2\pi}}e^{-\frac{(h - e^\beta)^2}{2}} \times \frac{1}{\sqrt{2\pi}}e^{-\frac{(\beta - 2)^2}{2}} \end{align}$$
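
For a concrete observation, the maximization can also be carried out numerically. Maximizing the product above is equivalent to minimizing the negative log-posterior $\frac{(h - e^\beta)^2}{2} + \frac{(\beta - 2)^2}{2}$ (up to an additive constant). Below is a minimal sketch; the observed value $h = 8$ is an assumption for illustration, not part of the problem:

```python
import math

h = 8.0  # an assumed observed value, for illustration only

def neg_log_posterior(beta):
    # up to an additive constant: -log f(h|beta) - log f(beta)
    return (h - math.exp(beta))**2 / 2 + (beta - 2.0)**2 / 2

# simple grid search over beta in [-5, 5] with step 1e-4
betas = [i * 1e-4 for i in range(-50_000, 50_001)]
beta_map = min(betas, key=neg_log_posterior)
print(beta_map)  # near ln(8) ≈ 2.079, pulled slightly toward the prior mean 2
```

For a rough check, setting the derivative of the log-posterior to zero gives the stationarity condition $(h - e^\beta)e^\beta = \beta - 2$, whose root for $h = 8$ sits just below $\ln 8$: the noisy observation dominates, and the prior nudges the estimate slightly toward $2$.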