How to average the function below with a Gaussian distribution


I need to average the function $$ f(x) = \frac{J^2+2x^2+2x^2\cos\left[\sqrt{J^2+4x^2}\right]}{J^2+4x^2} $$ with a Gaussian probability distribution. In other words, I need to evaluate the integral $$ \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{x^2}{2\sigma^2}\right)f(x)\,dx $$ Here $J$ is a constant and $\sigma^2$ is the variance of the Gaussian distribution. Any idea how to do this?


There are 2 answers below

On BEST ANSWER

Rewrite $f$ in the following way:

$$f(x) = \frac{J^2+2x^2+2x^2\cos[\sqrt{J^2+4x^2}]}{J^2+4x^2} = 1 + \frac{2x^2(\cos[\sqrt{J^2+4x^2}]-1)}{J^2+4x^2}$$

The integral now becomes:

$$1 + \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{\infty} \frac{2x^2(\cos[\sqrt{J^2+4x^2}]-1)}{J^2+4x^2} e^{-\frac{x^2}{2\sigma^2}} dx$$

since the integral of the full Gaussian is $1$. Now focusing on the integral left over, expand the cosine as a series, using $\cos u - 1 = \sum_{n=1}^\infty \frac{(-1)^n}{(2n)!}u^{2n}$ with $u=\sqrt{J^2+4x^2}$:

$$ = \sum_{n=1}^\infty \frac{(-1)^n}{(2n)!}\frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{\infty} 2x^2(J^2+4x^2)^{n-1} e^{-\frac{x^2}{2\sigma^2}} dx$$

$$ = \sum_{n=1}^\infty \sum_{k=0}^{n-1}\frac{(-1)^n}{(2n)!}\frac{1}{2\sqrt{2\pi\sigma^2}}{{n-1}\choose k}\int_{-\infty}^{\infty}J^{2n-2k-2}4^{k+1}x^{2k+2} e^{-\frac{x^2}{2\sigma^2}} dx$$

$$ = \sum_{n=1}^\infty \sum_{k=0}^{n-1}\frac{(-1)^n}{(2n)!}{{n-1}\choose k}J^{2n-2k-2}2^{2k+1}\sigma^{2k+2}(2k+1)!!$$

by the moment formula for the normal distribution, $\int_{-\infty}^{\infty} x^{2m}\,\frac{e^{-x^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}\,dx = (2m-1)!!\,\sigma^{2m}$. Then swapping the order of the summations and using $(2k+1)!! = \frac{(2k+1)!}{2^k k!}$:

$$= \sum_{k=0}^\infty \sum_{n=k+1}^\infty\frac{(-1)^n}{(2n)!}{{n-1}\choose k}J^{2n-2k-2}\,2^{k+1}\sigma^{2k+2}\frac{(2k+1)!}{k!}$$

$$= \sum_{k=0}^\infty \frac{(-1)^{k+1}\,{}_1F_2\left(k+1;k+\frac{3}{2},k+2;-\frac{J^2}{4}\right)}{2\,(k+1)!}\left(2\sigma^2\right)^{k+1}$$

and adding back the leftover $1$ from earlier (half of which is absorbed as the $k=0$ term of the sum), the final answer is

$$= \frac{1}{2} + \frac{1}{2}\sum_{k=0}^\infty {}_1F_2\left(k;k+\frac{1}{2},k+1;-\frac{J^2}{4}\right)\frac{(-2\sigma^2)^k}{k!}$$

which is as far as I could get.
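As a sanity check, this closed form can be evaluated numerically. The following is my own verification sketch, not part of the original post: it sums ${}_1F_2$ directly from its defining series, evaluates $\frac{1}{2} + \frac{1}{2}\sum_{k}{}_1F_2\left(k;k+\frac{1}{2},k+1;-\frac{J^2}{4}\right)\frac{(-2\sigma^2)^k}{k!}$ (with the $2^k$ from the double-factorial identity carried through), and can be compared with the value $0.577933159393786$ quoted from numerical integration in the other answer for $\sigma=1$, $J=\frac12$.

```python
import math

def hyp1f2(a, b, c, z, terms=200):
    """1F2(a; b, c; z), summed directly from its defining series."""
    term, total = 1.0, 1.0
    for m in range(terms):
        term *= (a + m) * z / ((b + m) * (c + m) * (m + 1))
        total += term
    return total

def series_average(J, sigma, kmax=60):
    """I = 1/2 + (1/2) * sum_k 1F2(k; k+1/2, k+1; -J^2/4) * (-2 sigma^2)^k / k!"""
    z = -J * J / 4.0
    total = 0.0
    term = 1.0                      # holds (-2 sigma^2)^k / k!
    for k in range(kmax):
        total += 0.5 * hyp1f2(k, k + 0.5, k + 1.0, z) * term
        term *= -2.0 * sigma * sigma / (k + 1)
    return 0.5 + total
```

At $J=0$ every ${}_1F_2$ factor equals $1$ and the sum collapses to $\frac{1}{2}\left(1+e^{-2\sigma^2}\right)$, which is the exact average of $f(x)=\frac12(1+\cos 2x)$.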


$\mathbf{\text{EDIT}}$: Suppose $\frac{J^2}{4} \ll 1$. Then each hypergeometric factor tends to $1$ and the sum becomes approximately $\frac{1}{2}\left(1+e^{-2\sigma^2}\right)$, matching the $J\to0$ limit $f(x)\to\frac{1}{2}(1+\cos 2x)$.


$\mathbf{\text{EDIT}}$: Alternatively, Wolfram tells me I could have done the unswapped summation (summing over $k$ first for each fixed $n$), leading to the following alternate answer in terms of the Tricomi confluent hypergeometric function $U$:

$$ 1 + \frac{J}{8\sqrt{2}\,\sigma}\sum_{n=1}^\infty U\left(\frac{3}{2},n+\frac{3}{2},\frac{J^2}{8\sigma^2}\right)\frac{(-J^2)^n}{(2n)!}$$
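A $U$-form like this can be checked numerically without any special-function library: when the parameter difference $b-a-1$ is a nonnegative integer $m$, $U$ reduces to a terminating sum (DLMF 13.2.7), $U(a,a+m+1,z)=z^{-a}\sum_{k=0}^{m}\binom{m}{k}(a)_k z^{-k}$. A sketch of my own, with prefactor $\frac{J}{8\sqrt{2}\,\sigma}$ and argument $\frac{J^2}{8\sigma^2}$ (the normalization my own bookkeeping of the Gaussian moments gives):

```python
import math

def tricomi_U(n, z):
    """U(3/2, n + 3/2, z) for integer n >= 1.  Since b - a - 1 = n - 1 is a
    nonnegative integer, the 2F0 series terminates (DLMF 13.2.7):
    U(a, a+m+1, z) = z^(-a) * sum_{k=0}^{m} C(m, k) * (a)_k * z^(-k)."""
    total, poch = 0.0, 1.0          # poch holds the Pochhammer symbol (3/2)_k
    for k in range(n):
        total += math.comb(n - 1, k) * poch / z ** k
        poch *= 1.5 + k
    return total / z ** 1.5

def u_series_average(J, sigma, nmax=40):
    """1 + J/(8*sqrt(2)*sigma) * sum_n U(3/2, n+3/2, J^2/(8 sigma^2)) (-J^2)^n/(2n)!"""
    z = J * J / (8.0 * sigma * sigma)
    total = 1.0
    term = 1.0                      # holds (-J^2)^n / (2n)!
    for n in range(1, nmax):
        term *= -J * J / ((2 * n - 1) * (2 * n))
        total += J / (8.0 * math.sqrt(2.0) * sigma) * tricomi_U(n, z) * term
    return total
```

For $\sigma=1$, $J=\frac12$ this reproduces the numerically integrated value $0.577933159$ quoted in the other answer. (For very small $J$ the intermediate $z^{-k}$ factors overflow; the truncation order then has to be reduced.)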

On

For the most general case, as Matti P. commented, numerical integration is the way to go.
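A minimal sketch of that route in plain Python (the quadrature scheme, interval half-width, and point count here are my own choices, not Matti P.'s):

```python
import math

def f(x, J):
    """The function to be averaged."""
    d = J * J + 4.0 * x * x
    if d == 0.0:                    # removable singularity at J = x = 0, where f -> 1
        return 1.0
    return (J * J + 2 * x * x + 2 * x * x * math.cos(math.sqrt(d))) / d

def gaussian_average(J, sigma, n=20001, half_width=12.0):
    """Average f against N(0, sigma^2) with composite Simpson's rule on
    [-half_width*sigma, half_width*sigma]; the tails beyond contribute
    less than exp(-half_width^2/2) and are negligible."""
    a = -half_width * sigma
    h = 2.0 * half_width * sigma / (n - 1)
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma)
    g = lambda x: norm * math.exp(-x * x / (2.0 * sigma * sigma)) * f(x, J)
    s = g(a) + g(-a)
    for i in range(1, n - 1):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3.0
```

At $J=0$ this reproduces the exact value $\frac{1}{2}\left(1+e^{-2\sigma^2}\right)$, a useful sanity check.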

For the case where $J$ is small, expand as a Taylor series $$f(x)=\frac{1}{2} \left(\cos \left(2 \sqrt{x^2}\right)+1\right)+\frac{ \left(-\sqrt{x^2} \sin \left(2 \sqrt{x^2}\right)-\cos \left(2 \sqrt{x^2}\right)+1\right)}{8 x^2}J^2+O\left(J^4\right)$$ Integrating termwise, this would give for the integral (without any simplification) $$I=a + b \,J^2+O\left(J^4\right)$$ where $$a=\frac{1}{2} \left(1+e^{-2 \sigma ^2}\right)$$ $$b=\frac{e^{-2 \sigma ^2} \left(e^{2 \sigma ^2} \left(\sqrt{2 \pi } \sigma \,\text{erf}\left(\sqrt{2} \sigma \right)-2\right)+2\right)}{16 {\sigma ^2}}$$

In fact, using a CAS, I obtained results up to $O\left(J^{12}\right)$; I shall not type them here.

Edit

For the test case where $\sigma=1$ and $J=\frac 12$, the expansion to $O\left(J^{12}\right)$ gives $$\frac{e^2 \left(3084331196575 \sqrt{2 \pi } \text{erf}\left(\sqrt{2}\right)+91193306677248\right)+103594350299892}{194819716546560\, e^2}$$ which is $0.577933159393790$ while numerical integration gives $0.577933159393786$. Not too bad, I hope.
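This number can be reproduced in a couple of lines, since Python's math module provides erf:

```python
import math

# The O(J^12) closed form evaluated at sigma = 1, J = 1/2, exactly as printed.
value = (math.exp(2) * (3084331196575 * math.sqrt(2 * math.pi) * math.erf(math.sqrt(2))
                        + 91193306677248)
         + 103594350299892) / (194819716546560 * math.exp(2))
```

which agrees with the quoted $0.577933159393790$.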

Update

Reworking the problem after Ninad Munshi's answer, I still think that, at least from a numerical point of view, the series solution is worth considering.

Writing $$\frac{2x^2(\cos[\sqrt{J^2+4x^2}]-1)}{J^2+4x^2}=\sum_{n=0}^\infty c_n(x)\, J^{2n}$$ turns the integral into $$I=1 + \frac{1}{\sqrt{2\pi\sigma^2}}\sum_{n=0}^\infty K_n J^{2n}\qquad \text{where}\qquad K_n=\int_{-\infty}^{+\infty} c_n(x)\, e^{-\frac{x^2}{2\sigma^2}}\, dx $$ Computing the $K_n$ poses no major problem.

For any $\sigma$, the ratio $\frac{K_{n+1}}{K_n}$ is quite small, and convergence is reached after adding only a few terms, even around the maximum value $J=\frac \pi 2$.

For this value $J=\frac \pi 2$, with $I_{(p)}$ denoting the partial sum through $J^{2p}$, here are some results for a few values of $\sigma$.

$$\left( \begin{array}{cccccccc} \sigma & I_{(0)} & I_{(1)}& I_{(2)}& I_{(3)}& I_{(4)}& I_{(5)} & I_{(6)} \\ 0.25 & 0.941248 & 0.953478 & 0.952462 & 0.952507 & 0.952506 & 0.952506 & 0.952506 \\ 0.50 & 0.803265 & 0.845634 & 0.842016 & 0.842180 & 0.842176 & 0.842176 & 0.842176 \\ 0.75 & 0.662326 & 0.738565 & 0.731778 & 0.732094 & 0.732085 & 0.732085 & 0.732085 \\ 1.00 & 0.567668 & 0.669949 & 0.660384 & 0.660846 & 0.660832 & 0.660833 & 0.660833 \\ 1.25 & 0.521968 & 0.638651 & 0.627186 & 0.627761 & 0.627744 & 0.627744 & 0.627744 \\ 1.50 & 0.505554 & 0.627006 & 0.61454 & 0.615188 & 0.615168 & 0.615168 & 0.615168 \\ 1.75 & 0.501094 & 0.621389 & 0.608608 & 0.609294 & 0.609272 & 0.609272 & 0.609272 \\ 2.00 & 0.500168 & 0.616352 & 0.603695 & 0.604392 & 0.604369 & 0.604370 & 0.604370 \\ 2.25 & 0.500020 & 0.610899 & 0.598612 & 0.599303 & 0.599280 & 0.599281 & 0.599281 \\ 2.50 & 0.500002 & 0.605275 & 0.593478 & 0.594152 & 0.594130 & 0.594130 & 0.594130 \\ 2.75 & 0.500000 & 0.599782 & 0.588521 & 0.589173 & 0.589151 & 0.589151 & 0.589151 \\ 3.00 & 0.500000 & 0.594582 & 0.583864 & 0.584490 & 0.584469 & 0.584469 & 0.584469 \\ 3.25 & 0.500000 & 0.589740 & 0.579548 & 0.580149 & 0.580128 & 0.580128 & 0.580128 \\ 3.50 & 0.500000 & 0.585266 & 0.575575 & 0.576150 & 0.576130 & 0.576130 & 0.576130 \\ 3.75 & 0.500000 & 0.581149 & 0.571926 & 0.572476 & 0.572457 & 0.572457 & 0.572457 \\ 4.00 & 0.500000 & 0.577362 & 0.568576 & 0.569102 & 0.569084 & 0.569084 & 0.569084 \end{array} \right)$$
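The converged column can be cross-checked by direct quadrature. A sketch (my own check, not part of the original answer) using Gauss-Hermite quadrature via numpy, with the node count being my choice:

```python
import math
import numpy as np

def gauss_hermite_average(J, sigma, deg=80):
    """E[f(X)] for X ~ N(0, sigma^2) via Gauss-Hermite quadrature:
    substituting x = sqrt(2)*sigma*t turns the Gaussian weight into exp(-t^2),
    which is exactly the weight of numpy's hermgauss nodes."""
    t, w = np.polynomial.hermite.hermgauss(deg)
    x = math.sqrt(2.0) * sigma * t
    d = J * J + 4.0 * x * x                     # strictly positive for J != 0
    fx = (J * J + 2 * x * x + 2 * x * x * np.cos(np.sqrt(d))) / d
    return float(np.dot(w, fx) / math.sqrt(math.pi))
```

For $J=\frac\pi2$ this reproduces the converged entries of the table (for instance $0.952506$ at $\sigma=0.25$ and $0.660833$ at $\sigma=1$) to the printed precision.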