Bayesian Filtering Smoothing over multiple classes


I am classifying images over time into categories such as office, bathroom, living room, and so on. The idea is to use all of these classifications to categorize the room the robot is currently in.

I want to use a Bayesian filter or a similar technique that updates the current belief for each room category, based on the previous belief for each category and on the classification results acquired from the latest image. So, denoting:

$X = \{x_0, x_1, \dots, x_n\}$: the scene classes (10 in this case).

$I_t$: the classification results for each of the above classes at time $t$ (normalized across all categories).

What I want to calculate is:

$p(X_t|I_{1:t})$: the vector of probabilities for each class, given my previous belief $p(X_{t-1}|I_{1:t-1})$ and the current classification $p(X_t|I_t)$:

$$p(X_t|I_{1:t}) = \frac{p(I_t|X_t)\, p(X_t|I_{1:t-1})}{\sum_{x_t \in X} p(I_t|x_t)\, p(x_t|I_{1:t-1})}.$$

The denominator, as I understand it, works as a normalization term, but I cannot figure out how to compute it. I do not know $p(I_t|X_t)$; how can I model it? Using this Bayesian filter, will I obtain a probability distribution over all the categories after each update?
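For reference, here is a minimal sketch of the discrete recursive update I have in mind, written with NumPy. The number of classes, the self-transition probability `stay`, and the per-class likelihood vectors are all illustrative assumptions, not part of my actual system:

```python
import numpy as np

def predict(belief, stay=0.9):
    """Prediction step: assume the robot usually stays in the same room.
    With probability `stay` the class is unchanged; the remaining mass
    is spread uniformly over the other classes (an assumed model)."""
    n = belief.size
    leave = (1.0 - stay) / (n - 1)
    transition = np.full((n, n), leave)
    np.fill_diagonal(transition, stay)
    return transition @ belief  # p(x_t | I_{1:t-1})

def update(prior, likelihood):
    """Measurement update: multiply the predicted belief by the per-class
    likelihood p(I_t | x_t) and renormalize; the denominator of Bayes'
    rule is just the sum of these products over the discrete classes."""
    posterior = likelihood * prior
    return posterior / posterior.sum()  # p(x_t | I_{1:t})

# Example: 10 scene classes, uniform initial belief, random likelihoods
# standing in for real classifier outputs.
rng = np.random.default_rng(0)
belief = np.full(10, 0.1)
for _ in range(5):
    likelihood = rng.dirichlet(np.ones(10))
    belief = update(predict(belief), likelihood)
```

Since the classes are discrete, the normalizer is obtained by summing the unnormalized products over all 10 classes, so each update does yield a valid probability distribution.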