Prove that for $b > 0$, $\iint_{\mathbb R^2}e^{-b({x^2}+{y^2})} dA = \frac{\pi}{b}$.

Using polar coordinates, prove that for $b > 0$,

$$\iint_{\mathbb R^2}e^{-b({x^2}+{y^2})} dA = \frac{\pi}{b}$$

Using $\frac{\pi}{b}$, also prove that

$$\int_{-\infty}^{\infty} e^{-b{x^2}} dx = \sqrt{\frac{\pi}{b}}$$

So I know that in polar coordinates, $x=r\cos\theta$ and $y=r\sin\theta$. However, without knowing specific bounds for $r$ and $\theta$, how would I go about proving this?

Thanks in advance.


The way this goes:

  1. Change to polar coordinates, so that $x^2 + y^2 = r^2$.
  2. To cover the whole plane, the limits are $0 \le \theta \le 2\pi$ and $0 \le r < \infty$.
  3. $dA$ becomes $r\,dr\,d\theta$.

The extra factor of $r$ makes the integral elementary:

$$\iint_{\mathbb R^2}e^{-b(x^2+y^2)}\,dA = \int_0^{2\pi}\!\!\int_0^{\infty} e^{-br^2}\,r\,dr\,d\theta = 2\pi\left[-\frac{e^{-br^2}}{2b}\right]_0^{\infty} = 2\pi\cdot\frac{1}{2b} = \frac{\pi}{b}$$
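As a numerical sanity check of the polar computation, here is a small sketch using only the Python standard library (the function name, truncation radius `r_max`, and step count `n` are my own choices, not from the original). The $\theta$ integral factors out as $2\pi$, so only the radial integral is approximated by the midpoint rule:

```python
import math

def polar_gaussian_integral(b, r_max=10.0, n=200_000):
    """Midpoint-rule approximation of 2*pi * integral_0^r_max e^{-b r^2} r dr.

    For moderate b the tail beyond r_max is negligible, so this should
    approximate pi / b.
    """
    dr = r_max / n
    # Sum e^{-b r^2} * r at midpoints of each radial sub-interval.
    radial = sum(
        math.exp(-b * ((i + 0.5) * dr) ** 2) * (i + 0.5) * dr
        for i in range(n)
    ) * dr
    return 2 * math.pi * radial

b = 2.0
print(polar_gaussian_integral(b))  # close to pi / 2 ≈ 1.5708
print(math.pi / b)
```

For $b = 2$ the truncated tail is of order $e^{-200}$, so the discrepancy comes almost entirely from the midpoint rule and is far below printing precision.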

For the second part, write the double integral as a product of two one-dimensional integrals. Since the integrand factors, $e^{-b(x^2+y^2)} = e^{-bx^2}e^{-by^2}$,

$$\iint_{\mathbb R^2} e^{-b(x^2+y^2)}\,dA = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} e^{-bx^2}e^{-by^2}\,dx\,dy = \int_{-\infty}^{+\infty}e^{-bx^2}\,dx\int_{-\infty}^{+\infty}e^{-by^2}\,dy$$

Since the name of the integration variable doesn't matter, both factors equal $I = \int_{-\infty}^{+\infty}e^{-bx^2}\,dx$. Hence $I^2 = \frac{\pi}{b}$, and taking the positive square root (the integrand is positive, so $I > 0$) gives $I = \sqrt{\frac{\pi}{b}}$.
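The one-dimensional result can be checked numerically the same way (again a stdlib-only sketch; the function name, truncation bound `x_max`, and step count are my own choices):

```python
import math

def gaussian_integral_1d(b, x_max=10.0, n=200_000):
    """Midpoint-rule approximation of integral_{-x_max}^{x_max} e^{-b x^2} dx.

    The tails beyond +/- x_max are negligible for moderate b, so this
    should approximate sqrt(pi / b).
    """
    dx = 2 * x_max / n
    # Evaluate the integrand at the midpoint of each sub-interval.
    return sum(
        math.exp(-b * (-x_max + (i + 0.5) * dx) ** 2)
        for i in range(n)
    ) * dx

b = 3.0
print(gaussian_integral_1d(b))       # close to sqrt(pi / 3)
print(math.sqrt(math.pi / b))
```

The agreement between the two printed values is the numerical counterpart of $I^2 = \pi/b$.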