Nearest vector among vectors with a constant average


I have the following problem:

Given $\bar{x}\in \mathbb{R}^n$, and a set of vectors $x\in \mathbb{R}^n$ each with average $c$, i.e., $$\frac{\sum_{j=1}^n x_{j}}{n}=c, \ \ \ \ \forall x$$

I want to find the vector $x$ in this set that is nearest to $\bar{x}$. Is there a formula in terms of $\bar{x}$ that solves this problem?

Thanks in advance.


You cannot, in the general case, determine the vector nearest to the given $\bar{x}$ without examining at least $n-1$ of the vectors. The reason is that it is always possible for the next-to-last vector to take a value that forces the last vector to be exactly $\bar{x}$ in order for the average to come out to $c$.
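This forcing argument can be checked numerically. Below is a minimal sketch (the dimension $n$, the number of vectors $m$, and the random draws are all hypothetical choices, and $c$ is taken as the mean of the collection, as in the discussion below): the adversary picks the next-to-last vector so that the mean constraint forces the last vector to equal $\bar{x}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 6                  # dimension and number of vectors (illustrative sizes)
c = rng.normal(size=n)       # the prescribed mean of the m vectors
x_bar = rng.normal(size=n)   # the query point

# Adversary: pick the first m-2 vectors freely...
xs = [rng.normal(size=n) for _ in range(m - 2)]
# ...then choose the next-to-last vector so that the mean constraint
# (sum of all m vectors equals m*c) forces the last vector to be x_bar.
x_last = x_bar.copy()
x_penult = m * c - sum(xs) - x_last
xs += [x_penult, x_last]

# The mean is still c, yet the nearest vector to x_bar is the very last
# one, at distance exactly 0 -- so no prefix of the list can reveal it.
dists = [np.linalg.norm(x - x_bar) for x in xs]
print(np.argmin(dists), min(dists))
```

The point is that the first $m-2$ vectors carry no information about which vector ends up nearest; the adversary can always rescue the constraint with the last two.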

(BTW, did you mean the $n$ in the summation to be the same as the $n$ describing the dimensionality of the vector space?)

A related, and quite interesting, question is this: Given that the $x_i$ vectors were chosen as normally distributed random vectors in $\Bbb{R}^n$, and that you are told their mean $\vec{c}$, find an algorithm that maximizes the probability of success in the following game:

You look at the vectors one at a time, and at some point bet that the vector just seen is the closest to $\vec{c}$. (Your algorithm tells you how to decide whether to bet on the current vector.) You succeed if it is indeed the closest. You fail by betting on the wrong vector, or by passing up the closest vector, after which you will never see another you could sensibly bet on.

The analogous problem for a sequence of numbers is the well-known secretary problem, but the wrinkle of placing it in $\Bbb{R}^n$ and revealing the overall mean might change the optimal algorithm (which in the well-known 1-D case is to let $1/e$ of the numbers pass, then bet on the first one after that which is closer than everything seen so far).
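For what it's worth, the classic comparison-based $1/e$ stopping rule carries over directly to this vector setting and can be tested by simulation. The sketch below is only an illustration under assumed parameters (dimension, number of vectors, Gaussian draws, target taken as the sample mean), not a claim that this rule is optimal here:

```python
import math
import numpy as np

def one_over_e_rule(vectors, target):
    """Secretary-style stopping rule, adapted to 'closest to target'.

    Observe the first m/e vectors without committing, then bet on the
    first vector that is closer to target than everything seen so far.
    Returns the index bet on (the last index if forced to the end).
    """
    m = len(vectors)
    k = int(m / math.e)
    dists = [np.linalg.norm(v - target) for v in vectors]
    best_seen = min(dists[:k], default=math.inf)
    for i in range(k, m):
        if dists[i] < best_seen:
            return i
    return m - 1

# Monte Carlo estimate of the success probability (illustrative setup).
rng = np.random.default_rng(0)
n, m, trials = 3, 20, 2000
wins = 0
for _ in range(trials):
    vecs = [rng.normal(size=n) for _ in range(m)]
    c = np.mean(vecs, axis=0)           # the revealed mean
    pick = one_over_e_rule(vecs, c)
    dists = [np.linalg.norm(v - c) for v in vecs]
    wins += (pick == int(np.argmin(dists)))
print(wins / trials)  # classic theory predicts roughly 1/e for large m
```

Since the vectors are exchangeable, the rank ordering of their distances to the sample mean is a uniformly random permutation, so the classic secretary analysis applies to this comparison-based rule; whether using the actual distance values (not just their ranks) can beat $1/e$ is exactly the open wrinkle mentioned above.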