Stable coordinates in Indoor localization


I am implementing an indoor localization demo for a project. I use ultra-wideband (UWB) technology to measure the distances from three different anchors to a moving object, then I use a least-squares method with trilateration to calculate the coordinates of the object. However, due to some environmental factors (possibly nearby objects), the distance values fluctuate slightly even when the object is not moving, so the computed coordinates fluctuate as well.
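For reference, my current trilateration step looks roughly like this (a minimal sketch with made-up anchor positions; the real anchor coordinates differ):

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares trilateration: linearize the range equations
    ||x - a_i||^2 = d_i^2 by subtracting the first anchor's equation
    from the others, then solve the resulting linear system."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    a0, d0 = anchors[0], d[0]
    # 2 (a_i - a_0) . x = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: three anchors, object at (2, 3), noise-free distances
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([2.0, 3.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))  # close to [2. 3.]
```

With noisy distances, each call returns a slightly different position, which is the jitter I described.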

Can you suggest a mathematical model (or algorithm) to minimize this coordinate instability and produce a smoother coordinate output?

Thanks in advance!

I suggest you use maximum likelihood inference, which is a very general and powerful technique, as long as you have a statistical model for the behavior of the object and for the observations.

Let me start by illustrating the technique in the case where you know the object is not moving. Specifically, if $x$ represents the true position of the object, and $y=(y_1,\dots,y_k)$ represents the observations (e.g., $y_i$ contains the three distances in the $i$th measurement), define the posterior $p(x|y)$ as the probability density that the true position is $x$, given that you've observed $y$. Then, use optimization methods to compute

$$\max_{\hat{x}} p(\hat{x}|y).$$

If you have a "noise model", i.e., a mathematical model for $p(y_i|x)$ (for example, maybe $y_i$ is a multivariate Gaussian whose mean is given by the distances between $x$ and the three anchors, with some known standard deviation), then you can use Bayes' rule and an independence assumption to express $p(x|y)$ as a function of the $p(y_i|x)$'s and the prior on $x$, and then use that in the optimization above. Specifically,

$$p(x|y) = {p(y|x) p(x) \over p(y)} = {p(x) \prod_i p(y_i|x) \over p(y)}.$$

Here $p(y)$ doesn't depend on $x$, so it can be treated as a constant for the purposes of the optimization. Thus, the above optimization problem is equivalent to

$$\max_{\hat{x}} p(\hat{x}) \prod_i p(y_i|\hat{x}).$$

It's often easier to work with log-likelihoods, and since the logarithm is monotone, this is in turn equivalent to solving the optimization problem

$$\max_{\hat{x}} \log p(\hat{x}) + \sum_i \log p(y_i|\hat{x}).$$

Given a noise model, you should be able to write an explicit expression for $\log p(y_i|x)$. If you assume a uniform prior on $x$, then $\log p(x)$ is constant and can be dropped from the objective, leaving just the sum of the $\log p(y_i|x)$ terms, which you can maximize using any off-the-shelf optimization algorithm.
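Here is a minimal sketch of this for the stationary case, assuming i.i.d. Gaussian range noise and a uniform prior. Under that model, maximizing the log-likelihood over all $k$ readings reduces to least-squares fitting the mean measured distance per anchor, which the sketch does with Gauss-Newton iterations (the anchor positions and noise level are made-up):

```python
import numpy as np

def ml_position(anchors, measurements, x0, iters=20):
    """Maximum-likelihood position under the model
    y_ki = ||x - a_i|| + i.i.d. Gaussian noise, with a uniform prior on x.
    Since the noise is i.i.d. Gaussian, maximizing the log-likelihood over
    all k readings equals least-squares fitting the *mean* measured
    distance per anchor, solved here by Gauss-Newton iterations."""
    x = np.array(x0, dtype=float)
    y_mean = np.asarray(measurements).mean(axis=0)     # mean range per anchor
    for _ in range(iters):
        pred = np.linalg.norm(anchors - x, axis=1)     # predicted ranges
        J = (x - anchors) / pred[:, None]              # Jacobian d(pred)/dx
        dx, *_ = np.linalg.lstsq(J, y_mean - pred, rcond=None)
        x += dx
    return x

rng = np.random.default_rng(0)
anchors = np.array([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)])
true_pos = np.array([2.0, 3.0])
sigma = 0.3                                            # assumed noise level
true_d = np.linalg.norm(anchors - true_pos, axis=1)
y = true_d + sigma * rng.standard_normal((50, 3))      # 50 noisy readings
print(ml_position(anchors, y, x0=[5.0, 5.0]))          # close to [2. 3.]
```

Because the estimate pools all 50 readings, its jitter shrinks roughly by a factor of $\sqrt{50}$ compared with trilaterating each reading separately.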


More generally, you could make other modeling assumptions. For instance, you could assume the object is moving in a straight line. This would imply that its position $x$ at the $i$th observation is given by $x=x_0 + v t_i$, where $x_0$ is its initial position, $v$ is its velocity, and $t_i$ is the time at which the $i$th observation is taken. Now the hidden parameters to infer are $\theta=(x_0,v)$. You will need a prior on $\theta$ (for instance, perhaps you assume a uniform or Gaussian prior on $x_0$, and you assume that with probability $1/2$, $v=0$, and with probability $1/2$, $v$ is sampled from some Gaussian). Then you can formulate the maximum likelihood inference problem as finding the $\hat{\theta}$ that maximizes $p(\hat{\theta}|y)$.
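A sketch of this straight-line variant, again under i.i.d. Gaussian range noise and simplified to a flat prior on $\theta$ (so it drops the mixture prior on $v$ described above), stacking one residual per anchor per time step and running Gauss-Newton over the four parameters:

```python
import numpy as np

def ml_line_fit(anchors, times, measurements, theta0, iters=30):
    """ML estimate of theta = (x0, v) for the straight-line model
    x(t) = x0 + v t, under i.i.d. Gaussian range noise and a flat prior.
    Gauss-Newton on the stacked residuals (one per anchor per time)."""
    theta = np.array(theta0, dtype=float)              # [x0x, x0y, vx, vy]
    for _ in range(iters):
        J_rows, r = [], []
        for t, y_t in zip(times, measurements):
            pos = theta[:2] + theta[2:] * t
            diff = pos - anchors                       # (3, 2)
            pred = np.linalg.norm(diff, axis=1)        # predicted ranges
            u = diff / pred[:, None]                   # unit vectors
            # d(pred)/d(x0) = u,  d(pred)/d(v) = u * t
            J_rows.append(np.hstack([u, u * t]))
            r.append(y_t - pred)
        J = np.vstack(J_rows)                          # (3k, 4)
        dtheta, *_ = np.linalg.lstsq(J, np.concatenate(r), rcond=None)
        theta += dtheta
    return theta

rng = np.random.default_rng(1)
anchors = np.array([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)])
x0, v = np.array([1.0, 2.0]), np.array([0.5, 0.2])
times = np.arange(20) * 0.1                            # 20 observation times
meas = [np.linalg.norm((x0 + v * t) - anchors, axis=1)
        + 0.1 * rng.standard_normal(3) for t in times]
theta = ml_line_fit(anchors, times, meas, theta0=[5.0, 5.0, 0.0, 0.0])
print(theta)  # close to [1.0, 2.0, 0.5, 0.2]
```

Note that the stationary case is recovered by fixing $v=0$, and richer motion models just change the parameterization of $\text{pos}(t)$ and its Jacobian.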

Hopefully you can see how this is a flexible approach that can generalize to many different assumptions about your particular application.