How to estimate the axis of symmetry for an even function with error?


I have a situation where, for an unknown $t$ and an unknown but nice* real function $f$ such that $x \mapsto f(x-t)$ is even, I measure $f(x) + \epsilon_x$, where $\epsilon_x$ is some kind of more or less random error, hopefully small.

Now, I am looking for a sensible estimator for $t$.

I note that my situation is not well defined, but is there some theory I might look into that addresses this or similar situations?

*: Having all properties you desire to get a meaningful result.


BEST ANSWER

Consider $$ g(A, s) = \int_{-A}^A [h(s-x) - h(x)] dx $$ where $h = f + \epsilon$ is your observed function.

Assuming for the moment that $\epsilon = 0$, $A \mapsto g(A, s)$ is a function that's identically zero when $s = t$, and unlikely to be zero elsewhere.

As you add $\epsilon$ back in, the function $A \mapsto g(A, s)$ at $s = t$ is bounded by something like $cA$, but in fact stays generally close to zero (it's an integral of noise, after all). For $s \ne t$, you expect it to grow rather more rapidly (assuming $f$ wasn't chosen by an adversary who knows how to make integrals small :) ).

If I had actual data, I'd estimate the integrals for a few values of $A$, say $A = c, 2c, 4c, \dots$, plot the result as a function of $s$, and see how things look.

I'm not really saying much here except that "you can try all possibilities and this is an easy way to find some promising ones without actually getting up close and personal with the data."

ANSWER

Since it looks like you are going to measure some physical quantity, it seems fair to assume that $x=t$ is a local maximum or minimum for $f$. This would not be the case with a function like $$f(x)=x^3{\sin{\frac{1}{x}}},$$ which is even and differentiable but doesn't exhibit a maximum or a minimum at $x=0$.

If you have a rough idea of where the symmetry axis $x=t$ is and of how $f$ behaves in a neighborhood of $t$, you could try to look for the zero of the first derivative by measuring the ratios $$\Delta y /\Delta x.$$ More precisely, you can choose some points $x_1,x_2,x_3,\dots$ around $t$ at some fixed spacing $D$ and measure $$r_i=\dfrac{y(x_i+\frac{\delta}{2})-y(x_i-\frac{\delta}{2})}{\delta},$$ with $\delta \ll D$. If you expect only one zero of the derivative in the range of the $x_i$'s, then the $x_i$ with the smallest $|r_i|$ gives an estimate of $t$ ($r_i$ will be an increasing function of $x_i$, changing sign at the extremum). The uncertainty associated with this $t$ would be $\Delta t \sim D$, the details depending on how you do the measurement and the analysis.
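A quick sketch of the finite-difference procedure (the quadratic test curve, the noise level, and the specific values of $D$ and $\delta$ are toy assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def y(x):
    """Noisy measurement of a toy function with a minimum at the symmetry axis t = 1.3."""
    t = 1.3
    return (x - t) ** 2 + 0.001 * rng.standard_normal(np.shape(x))

D, delta = 0.2, 0.02                 # spacing of the x_i and the small step, delta << D
x_i = np.arange(0.0, 2.6, D)         # rough bracket around the expected t
r_i = (y(x_i + delta / 2) - y(x_i - delta / 2)) / delta   # finite-difference slopes

# The derivative changes sign at the extremum; take the x_i with the smallest |r_i|.
t_hat = x_i[np.argmin(np.abs(r_i))]
print(t_hat)
```

Note the trade-off: a smaller $\delta$ localizes the derivative better but amplifies the measurement noise in $r_i$ by a factor $1/\delta$.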


Maybe a more practical way would be to take several points around the expected $t$ and perform a quadratic (or quartic) fit of the form $$y_i = ax_i^2+bx_i+c,$$ obtaining $$t=-\frac{b}{2a}.$$ You can do this with software like QtiPlot or, if you are very patient, by hand. This method is applicable if $$\left|\dfrac{\text d y}{\text d x}\right|_{x=x_i}\sigma _{x_i}\ll \sigma _{y_i}.$$ If you can give a good estimate of the uncertainties $\sigma _{y_i}$, the fit will also give the uncertainties of the parameters $a,b,c$, and you can test the goodness of the quadratic approximation with a $\chi ^2$ test.
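The quadratic fit takes a few lines with NumPy instead of QtiPlot (the parabola, vertex location, and noise level below are made-up test values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: parabola with vertex at t = 0.42 plus measurement noise.
t_true = 0.42
x = np.linspace(-0.5, 1.5, 21)
y = 3.0 * (x - t_true) ** 2 + 1.0 + 0.01 * rng.standard_normal(x.size)

# Least-squares quadratic fit y = a x^2 + b x + c; the axis estimate is t = -b / (2a).
a, b, c = np.polyfit(x, y, 2)
t_hat = -b / (2 * a)
print(t_hat)
```

If you know the $\sigma_{y_i}$, `np.polyfit(x, y, 2, w=1/sigma_y, cov=True)` also returns a covariance matrix for $(a, b, c)$, from which an uncertainty on $t$ can be propagated.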