I am looking for an efficient way to find the value of $k$ that minimizes
$\sum_t(s_t - b_{t+k})^2$
where $s$ and $b$ are N-dimensional vectors and the values are wrapped around like this:
$b_{t+k} := b_{(t+k)-N}$ if $(t+k) > N$
$b_{t+k} := b_{(t+k)+N}$ if $(t+k) < 1$
I need this for an algorithm, and at the moment I have no better idea than to try all possible values of $k$.
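The brute-force search is easy to write down; here is a minimal NumPy sketch (the function name and the use of `np.roll` for the circular shift are my own choices, not from the question):

```python
import numpy as np

def best_shift_bruteforce(s, b):
    """Return the k in 0..N-1 minimizing sum_t (s[t] - b[(t+k) % N])^2.

    np.roll(b, -k)[t] == b[(t+k) % N], which implements the wraparound
    from the question (0-based here). Trying all N shifts costs O(N^2).
    """
    N = len(s)
    errs = [np.sum((s - np.roll(b, -k)) ** 2) for k in range(N)]
    return int(np.argmin(errs))
```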
[Image: phase estimate in blue and certainty measure in red; the y axis is the estimated displacement for the blue curve, and the x axis is frequency, with DC in the middle.]

First, note that if you multiply out the square, the two sums of squared terms don't depend on $k$ (because of the wraparound, shifting $b$ merely reorders its terms), so you are effectively maximizing the circular cross-correlation $\sum_ts_tb_{t+k}$. If you read up a bit on convolutions and discrete Fourier transforms, you'll find that they let you find that maximum in $O(N\log N)$ time instead of the $O(N^2)$ time it takes to try all values of $k$.
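Written out explicitly, with the wraparound making the shifted sum of squares of $b$ a pure reordering of its terms:

$$\sum_t (s_t - b_{t+k})^2 = \underbrace{\sum_t s_t^2 + \sum_t b_t^2}_{\text{independent of } k} - 2\sum_t s_t b_{t+k}.$$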
Edit, with a rough sketch of what to do: take the Fourier transform of both sequences, multiply one transform by the complex conjugate of the other (the conjugation corresponds to reversing the original sequence, which is what lets you write your sum as a convolution), then take the inverse Fourier transform of the product. That gives you the cross-correlation sum as a function of $k$, and you can read off the optimal $k$ from the entry with the highest value.
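A minimal NumPy sketch of that recipe (the function name is mine; for real-valued inputs you could use `np.fft.rfft`/`irfft` instead):

```python
import numpy as np

def best_shift_fft(s, b):
    """Return argmax_k of the circular cross-correlation sum_t s[t] * b[(t+k) % N].

    Conjugating FFT(s) before multiplying turns the pointwise product into
    the cross-correlation rather than the convolution; the inverse FFT then
    yields the correlation for every k at once, in O(N log N).
    """
    c = np.fft.ifft(np.conj(np.fft.fft(s)) * np.fft.fft(b)).real
    return int(np.argmax(c))
```

For real inputs the inverse transform is real up to rounding error, so taking `.real` and picking the largest value is enough.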