Using polar coordinates, prove that for $b > 0$,
$$\iint_{\mathbb R^2}e^{-b({x^2}+{y^2})} dA = \frac{\pi}{b}$$
Using $\frac{\pi}{b}$, also prove that
$$\int_{-\infty}^{\infty} e^{-b{x^2}} dx = \sqrt{\frac{\pi}{b}}$$
So I know that in polar coordinates, $x=r\cos\theta$ and $y=r\sin\theta$. However, without knowing specific bounds for $r$ and $\theta$, how would I go about proving this?
Thanks in advance.
I think the way this goes is: to cover all of $\mathbb R^2$, the polar bounds are $0 \le r < \infty$ and $0 \le \theta \le 2\pi$, and the area element is $dA = r\,dr\,d\theta$. Since $x^2 + y^2 = r^2$, the integral becomes
$$\iint_{\mathbb R^2}e^{-b(x^2+y^2)}\, dA = \int_0^{2\pi}\int_0^\infty e^{-br^2}\, r\, dr\, d\theta$$
Now the integral is solvable: the extra factor of $r$ makes the substitution $u = r^2$, $du = 2r\,dr$ work, giving
$$\int_0^\infty e^{-br^2}\, r\, dr = \frac{1}{2}\int_0^\infty e^{-bu}\, du = \frac{1}{2b},$$
and multiplying by the $\theta$-integral, which contributes a factor of $2\pi$, yields $\frac{\pi}{b}$.
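As a quick sanity check (not part of the proof), the polar form can be verified numerically with NumPy; the value $b = 2$ below is an arbitrary choice for the check:

```python
import numpy as np

b = 2.0  # arbitrary b > 0 chosen for this check

# Radial integral ∫_0^∞ r e^{-b r²} dr, truncated at r = 10 where the
# integrand is negligibly small, approximated by the midpoint rule.
dr = 1e-4
r = np.arange(dr / 2, 10.0, dr)  # midpoints of the grid on (0, 10)
radial = np.sum(r * np.exp(-b * r**2)) * dr

# Multiply by the θ-range, 2π, to recover the full double integral.
value = 2 * np.pi * radial
print(value, np.pi / b)  # the two should agree to the grid's accuracy
```

The radial integral alone comes out to $\frac{1}{2b}$, matching the substitution step above.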
For the second part, write the first integral as a product of two one-dimensional integrals. As in
$$ \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} e^{-b(x^2+y^2)}\, dx\,dy = \int_{-\infty}^{+\infty}e^{-bx^2}\,dx\int_{-\infty}^{+\infty}e^{-by^2}\,dy $$
Since the variable of integration is just a dummy variable, the two factors are equal, so
$$ \left(\int_{-\infty}^{+\infty}e^{-bx^2}\,dx\right)^2 = \frac{\pi}{b}, $$
and taking the (positive) square root gives you your answer: $\int_{-\infty}^{+\infty}e^{-bx^2}\,dx = \sqrt{\frac{\pi}{b}}$.
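Again as an optional numerical check (not needed for the proof), one can confirm with NumPy that the one-dimensional integral equals $\sqrt{\pi/b}$ and that its square recovers $\pi/b$; $b = 2$ is an arbitrary test value:

```python
import numpy as np

b = 2.0  # arbitrary b > 0 for the check

# One-dimensional integral ∫_{-∞}^{∞} e^{-b x²} dx, truncated at ±10
# (the tails beyond are negligible), via the midpoint rule.
dx = 1e-4
x = np.arange(-10.0 + dx / 2, 10.0, dx)
one_d = np.sum(np.exp(-b * x**2)) * dx

print(one_d, np.sqrt(np.pi / b))  # the 1-D integral is √(π/b)
print(one_d**2, np.pi / b)        # squaring recovers the double integral
```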