Here is my situation: for an unknown $t$ and an unknown but nice* real function $f$ such that $x \mapsto f(x-t)$ is even, I measure $f(x) + \epsilon_x$, where $\epsilon_x$ is some kind of more or less random error, hopefully small.
Now, I am looking for a sensible estimator for $t$.
I note that my situation is not well defined, but is there some theory I might look into that addresses this or similar situations?
*: Having all properties you desire to get a meaningful result.
Consider $$ g(A, s) = \int_{-A}^A [h(s-x) - h(x)] dx $$ where $h = f + \epsilon$ is your observed function.
Since $x \mapsto f(x-t)$ is even, $f$ is symmetric about $-t$: $f(-t+u) = f(-t-u)$ for all $u$. Assume for the moment that $\epsilon = 0$. Then $h(s-x) = h(x)$ for every $x$ exactly when $s = -2t$, so $A \mapsto g(A, s)$ is identically zero when $s = -2t$, and unlikely to be zero elsewhere. (One caveat: $g(A, 0) = 0$ for every $h$, because the integral runs over a symmetric interval, so the trivial zero at $s = 0$ has to be discounted.) A zero $\hat s$ of $g$ thus gives the estimate $\hat t = -\hat s/2$.

As you add $\epsilon$ back in, the function $A \mapsto g(A, s)$ at $s = -2t$ looks more like a function bounded by something like $cA$, but in fact staying generally close to zero (it's an integral of noise, after all); for $s \ne -2t$ (and $s \ne 0$) you expect it to grow rather more rapidly (assuming $f$ wasn't chosen by an adversary who knows how to make integrals small :) ).
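As a quick sanity check of the noise-free case, the criterion is easy to evaluate numerically. A minimal sketch (Python; the Gaussian test function, the name `g`, and all parameter choices are my own, not from the question) — note that evenness of $x \mapsto f(x-t)$ makes $f$ symmetric about $-t$, so $g$ vanishes at $s = -2t$:

```python
import numpy as np

def g(h, s, A, n=2001):
    """Trapezoidal approximation of g(A, s) = integral over [-A, A] of
    h(s - x) - h(x), for a callable h (in practice h would be
    interpolated from noisy samples)."""
    x = np.linspace(-A, A, n)
    vals = h(s - x) - h(x)
    dx = x[1] - x[0]
    return dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# Toy check: a Gaussian bump placed so that x -> f(x - t) is even,
# i.e. f is symmetric about -t.
t = 1.3
f = lambda x: np.exp(-(x + t) ** 2)
print(g(f, -2 * t, 3.0))  # ~ 0: the criterion vanishes at s = -2t
print(g(f, 2.0, 3.0))     # clearly nonzero away from s = -2t and s = 0
```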
If I had actual data, I'd estimate the integrals for a few values of $A$, say $A = c, 2c, 4c, \ldots$, plot the result as a function of $s$, and see how things look.
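With simulated noisy samples, that "few values of $A$" scan might look like the following (a Python sketch; the test function, noise level, grids, and window sizes are all my own assumptions). It aggregates $|g(A, s)|$ over $A = c, 2c, 4c$, discounts the trivial zero at $s = 0$ (the integral over a symmetric interval vanishes there for any $h$), and, since $f$ is symmetric about $-t$, reads off $\hat t = -\hat s/2$ from the minimizer $\hat s$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: true t, f symmetric about -t, noisy samples on a fine grid.
t_true = 1.3
grid = np.linspace(-8, 8, 4001)
h_samples = np.exp(-(grid + t_true) ** 2) + 0.02 * rng.standard_normal(grid.size)

def g_hat(s, A, n=801):
    """Trapezoidal approximation of the integral over [-A, A] of
    h(s - x) - h(x), with h linearly interpolated from the noisy samples."""
    x = np.linspace(-A, A, n)
    vals = np.interp(s - x, grid, h_samples) - np.interp(x, grid, h_samples)
    dx = x[1] - x[0]
    return dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# Scan s for window sizes A = c, 2c, 4c and aggregate |g| across them;
# using several A's kills accidental zeros that hold only for a single A.
s_grid = np.linspace(-5, 5, 1001)
score = sum(np.abs(np.array([g_hat(s, A) for s in s_grid]))
            for A in (1.0, 2.0, 4.0))
score = np.where(np.abs(s_grid) < 0.5, np.inf, score)  # drop trivial zero at s = 0
s_hat = s_grid[np.argmin(score)]
t_hat = -s_hat / 2  # the criterion vanishes at s = -2t
print(t_hat)        # should land close to t_true
```

Plotting `score` against `s_grid` is the "see how things look" step: one sharp dip away from the origin is the signature you hope for.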
I'm not really saying much here beyond "you can try all the possibilities, and this is an easy way to find some promising ones without actually getting up close and personal with the data."