How to evaluate $|f(x+\varepsilon)-f(x)|$, where $f$ is an elementary function and $\varepsilon$ is a random variable?


Recently, I derived an important indicator in my research on signal processing. Unfortunately, the subsequent problem of theoretical analysis has me stuck.

This indicator is denoted $y$: $$y=\frac{|a_1|-|a_2|}{|a_1|+|a_2|}$$ where $|a_1|$ and $|a_2|$ are the amplitudes of two signals.

In practice, to get $y$ I first measure the received strengths of these two signals, denoted $s_1$ and $s_2$ (surely $s_1,s_2>0$). If there were no noise, I could directly obtain $|a_1|$ and $|a_2|$ from $$ |a_1|=\sqrt{s_1} $$ $$ |a_2|=\sqrt{s_2} $$ and then compute $y$.

But real environments are always noisy, which means the signal strengths I receive are corrupted by noise: $$ s_1=|a_1+n|^{2} $$ $$ s_2=|a_2+n|^{2} $$ where $n\sim\mathcal{CN}(0,\sigma^{2})$.

Suppose I can measure $s_1$ and $s_2$ only once. To estimate $y$, what I can do is take $z$ as a substitute for $y$: $$ z=\frac{\sqrt{s_1}-\sqrt{s_2}}{\sqrt{s_1}+\sqrt{s_2}}=\frac{|a_1+n|-|a_2+n|}{|a_1+n|+|a_2+n|} $$ If $\sigma$ is small enough, then $z$ and $y$ are close to each other.
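This closeness claim can be sanity-checked with a minimal Monte Carlo sketch. The amplitudes $a_1=1$, $a_2=0.5$ below are hypothetical, and, following the formulas above literally, the same noise sample $n$ enters both strengths:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_z(a1, a2, sigma, n_trials=100_000):
    """Sample z = (|a1+n| - |a2+n|) / (|a1+n| + |a2+n|) with
    n ~ CN(0, sigma^2); per the formulas above, the same noise
    realization enters both strengths."""
    n = (rng.standard_normal(n_trials)
         + 1j * rng.standard_normal(n_trials)) * (sigma / np.sqrt(2))
    m1, m2 = np.abs(a1 + n), np.abs(a2 + n)
    return (m1 - m2) / (m1 + m2)

# Hypothetical amplitudes; y is the noise-free indicator.
a1, a2 = 1.0, 0.5
y = (abs(a1) - abs(a2)) / (abs(a1) + abs(a2))

for sigma in (0.01, 0.1, 0.5):
    z = simulate_z(a1, a2, sigma)
    print(f"sigma={sigma}: mean |z - y| = {np.mean(np.abs(z - y)):.4f}")
```

As expected, the mean deviation $|z-y|$ shrinks with $\sigma$.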

But in the more general situation, how can I evaluate the degree of closeness between $z$ and $y$ when the noise power $\sigma^{2}$ is known? It would be best if an analytic solution, or at least an approximate analytical solution, exists.

For example, if I use a confidence interval to evaluate the degree of closeness, I first need to know the distribution of $z$ (i.e., the p.d.f. of $z$). With $\sigma$ known, I run into a few problems:

  1. Confidence interval: $P(|z-y|<\frac{\epsilon}{2})=0.95$, where $\epsilon$ describes the degree of closeness. $\epsilon$ depends on a known distribution.

  2. The p.d.f. of $z$ is related to $\sigma^{2}$, $|a_1|$ and $|a_2|$. However, $|a_1|$ and $|a_2|$ are unknown.

  3. Intuitively, $\epsilon$ should be larger if either $|a_1|$ or $|a_2|$ is much smaller than $\sigma$. But how can I obtain this result by theoretical analysis?

  4. The values of $|a_1|$ and $|a_2|$ are related in practice. Suppose that $|a_1|=\sqrt{x}$ and $|a_2|=\sqrt{1-x}$ with $x\sim U(0,1)$ (a close approximation to the real system in my research). Some extremes should be avoided: if the intuition in 3 is right and its theoretical analysis exists, I can reasonably redesign the system so that $x\sim U(0.25,0.75)$ (if you ask why not $x\sim U(0.1,0.9)$ or some other range, the explanation is that $x\sim U(0.25,0.75)$ is easier to realize in my system design).

  5. Based on 3 and 4, a specific problem could be: a fixed $x$ determines a specific p.d.f. of $z$, and the confidence interval comes from this p.d.f. So the interval length $\epsilon$ is determined by $x$ and $\sigma$, that is, $$\epsilon=\epsilon(x,\sigma)$$ Given that $x\sim U(0.25,0.75)$, $E[\epsilon(x,\sigma)]_{p_{x}(x)}$ could be calculated or estimated, and then regarded as the 95% confidence interval length under noise power $\sigma^{2}$.
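Lacking a closed form, $E[\epsilon(x,\sigma)]_{p_x(x)}$ can at least be estimated numerically. A sketch, reading $\epsilon(x,\sigma)$ as twice the empirical 95% quantile of $|z-y|$, at a hypothetical noise level $\sigma=0.1$:

```python
import numpy as np

rng = np.random.default_rng(1)

def eps_mc(x, sigma, n_trials=50_000):
    """Monte Carlo estimate of eps(x, sigma): from
    P(|z - y| < eps/2) = 0.95, eps is twice the 95% quantile of |z - y|.
    Amplitudes follow |a1| = sqrt(x), |a2| = sqrt(1 - x)."""
    a1, a2 = np.sqrt(x), np.sqrt(1 - x)
    y = (a1 - a2) / (a1 + a2)
    n = (rng.standard_normal(n_trials)
         + 1j * rng.standard_normal(n_trials)) * (sigma / np.sqrt(2))
    m1, m2 = np.abs(a1 + n), np.abs(a2 + n)
    z = (m1 - m2) / (m1 + m2)
    return 2.0 * np.quantile(np.abs(z - y), 0.95)

sigma = 0.1                                   # hypothetical noise level
xs = rng.uniform(0.25, 0.75, size=200)        # x ~ U(0.25, 0.75)
mean_eps = np.mean([eps_mc(x, sigma) for x in xs])
print(f"E[eps(x, sigma)] over x ~ U(0.25, 0.75): {mean_eps:.4f}")
```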

  6. If the upper or lower bound of $\epsilon(x,\sigma)$ is attained at some particular $x_0$, I can calculate it directly. However, even for fixed $x$, $\epsilon$ is difficult to obtain. In this case, the idea behind Chebyshev's inequality may be useful to estimate $\epsilon(x,\sigma)$:

    When $x$ is fixed, the p.d.f. of $z$ is fixed, and $$P(|z-y|<\frac{\epsilon}{2})=1-\int_{|z-y|\geq \frac{\epsilon}{2}}{p_{z}(z)\,dz}\geq 1-\int_{|z-y|\geq \frac{\epsilon}{2}}{\frac{4(z-y)^{2}}{\epsilon^{2}}p_{z}(z)\,dz}\geq 1-\frac{4}{\epsilon^{2}}\int{(z-y)^{2}p_{z}(z)\,dz}$$ Setting the last bound equal to $0.95$ gives $$\epsilon=\sqrt{80\int{(z-y)^{2}p_{z}(z)\,dz}}$$
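For fixed $x$, the second moment $\int(z-y)^{2}p_{z}(z)\,dz$ in this Chebyshev bound can itself be estimated by simulation. A sketch at a hypothetical operating point $x=0.6$, $\sigma=0.1$, comparing the Chebyshev interval length with the empirical 95% length (the Chebyshev bound is necessarily conservative):

```python
import numpy as np

rng = np.random.default_rng(2)

x, sigma = 0.6, 0.1                     # hypothetical operating point
a1, a2 = np.sqrt(x), np.sqrt(1 - x)
y = (a1 - a2) / (a1 + a2)

n = (rng.standard_normal(100_000)
     + 1j * rng.standard_normal(100_000)) * (sigma / np.sqrt(2))
m1, m2 = np.abs(a1 + n), np.abs(a2 + n)
z = (m1 - m2) / (m1 + m2)

# Chebyshev: eps = sqrt(80 * E[(z - y)^2]), with the integral
# replaced by a sample mean.
eps_cheb = np.sqrt(80 * np.mean((z - y) ** 2))
# Empirical interval length at the same 95% level, for comparison.
eps_emp = 2.0 * np.quantile(np.abs(z - y), 0.95)
print(f"Chebyshev eps = {eps_cheb:.4f}, empirical eps = {eps_emp:.4f}")
```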

  7. The main cause of the high computational complexity above is that the distribution of $z$ is complicated even when $|a_1|$ and $|a_2|$ are fixed. Let $t=\frac{s_1}{s_2}$; then $z=\frac{\sqrt{t}-1}{\sqrt{t}+1}$.

    The random variable $t$ follows a doubly non-central F-distribution (DNFD). The explicit p.d.f. of the DNFD can be found on page 49 of the Handbook on Statistical Distributions for Experimentalists (by Christian Walck). Someone familiar with special properties of the DNFD may also be able to help me a lot.
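As a quick sanity check, the change of variables $z=\frac{\sqrt{t}-1}{\sqrt{t}+1}$ with $t=s_1/s_2$ can be verified numerically (amplitudes and noise level below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical amplitudes and noise level.
a1, a2, sigma = 1.0, 0.5, 0.2
n = (rng.standard_normal(10_000)
     + 1j * rng.standard_normal(10_000)) * (sigma / np.sqrt(2))
s1, s2 = np.abs(a1 + n) ** 2, np.abs(a2 + n) ** 2

t = s1 / s2
z_from_t = (np.sqrt(t) - 1) / (np.sqrt(t) + 1)
z_direct = (np.sqrt(s1) - np.sqrt(s2)) / (np.sqrt(s1) + np.sqrt(s2))
print("max discrepancy:", np.max(np.abs(z_from_t - z_direct)))
```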