I read on Wikipedia that Laplace was the first to evaluate
$$\int\nolimits_{-\infty}^\infty e^{-x^2} \, \mathrm dx$$
Does anybody know what he was doing that led him to that integral? Even better, can someone pose a natural problem that would lead to this integral?
Edit: Many of the answers make a connection to the normal distribution, but then the question becomes: Where does the density function of the normal distribution come from? Mike Spivey's answer is in the spirit of what I am looking for: an explanation that a calculus student might understand.
You asked about a natural problem that leads to this integral. Here's a summary of the argument I give in my undergraduate probability theory class. (It's due to Dan Teague; he has the article here.)
Imagine throwing a dart at the origin in the plane. You're aiming at the origin, but there is some variability in your throws. The following assumptions perhaps seem reasonable.
Let the probability of the dart landing in a thin vertical strip from $x$ to $x + \Delta x$ be $p(x) \Delta x$. Similarly, let the probability of landing in a thin horizontal strip from $y$ to $y + \Delta y$ be $p(y) \Delta y$. Then the probability of the dart landing in the intersection of the two strips is $p(x) p(y) \Delta x \Delta y$. Since the orientation doesn't matter, any similar region $r$ units away from the origin has the same probability, and so we can express this probability in polar coordinates as $p(r) \Delta x \Delta y$; i.e., $p(r) = p(x) p(y)$.
Differentiating both sides of $p(r) = p(x) p(y)$ with respect to $\theta$ yields $0 = p(x) \frac{dp(y)}{d \theta} + p(y) \frac{dp(x)}{d \theta}$. Using $x = r \cos \theta$, $y = r \sin \theta$, simplifying, and separating variables produces the differential equation $$\frac{p'(x)}{x p(x)} = \frac{p'(y)}{y p(y)}.$$
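For a calculus student, the chain-rule step may be worth spelling out. Since $r$ does not depend on $\theta$, the left side differentiates to zero, and from $x = r \cos \theta$, $y = r \sin \theta$ we get $\frac{dx}{d\theta} = -r \sin \theta = -y$ and $\frac{dy}{d\theta} = r \cos \theta = x$. Substituting,
$$0 = p(x) p'(y) \frac{dy}{d\theta} + p(y) p'(x) \frac{dx}{d\theta} = p(x) p'(y) \, x - p(y) p'(x) \, y,$$
and dividing through by $x y \, p(x) p(y)$ gives the separated equation above.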
Now, we assumed that $x$ and $y$ are independent, yet this differential equation holds for any $x$ and $y$. This is only possible if each side equals the same constant $C$: $$\frac{p'(x)}{x p(x)} = \frac{p'(y)}{y p(y)} = C.$$ Solving the $x$ version of this differential equation yields $$\frac{dp}{p} = Cx \, dx \Rightarrow \ln p = \frac{Cx^2}{2} + c \Rightarrow p(x) = Ae^{Cx^2/2}.$$ Finally, since large errors are less likely than small errors, $C$ must be negative; writing $C = -k$ with $k > 0$, we have $$p(x) = A e^{-kx^2/2}.$$ Since $p(x)$ is a probability density function, $$\int_{-\infty}^{\infty} A e^{-kx^2/2} \, dx = 1,$$ which is just a scaled version of your original integral.
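If you want to sanity-check the separable ODE step numerically, here is a quick sketch (the constants $C = -1$ and $p(0) = 1$ are arbitrary choices for illustration): a simple Euler scheme for $p'(x) = C x \, p(x)$ tracks the closed form $A e^{Cx^2/2}$.

```python
import math

# Euler integration of the separable ODE p'(x) = C * x * p(x),
# with illustrative constants C = -1 and p(0) = A = 1, compared
# against the closed-form solution A * exp(C * x**2 / 2).
C = -1.0
p = 1.0          # initial condition p(0) = A = 1
x, h = 0.0, 1e-5
while x < 2.0:
    p += h * C * x * p   # Euler step
    x += h

closed_form = math.exp(C * x**2 / 2)
print(abs(p - closed_form) < 1e-3)  # → True
```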
(A little more work shows that $A = \sqrt{k/(2\pi)}$. Also, if you think about it some, it makes sense that $k$ should be inversely related to the variability in your throwing. And for the normal pdf, we do in fact have $k = 1/\sigma^2$.)
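These normalization claims are easy to check numerically with nothing but the standard library; below, $k = 2$ is an arbitrary choice, and the midpoint rule on $[-10, 10]$ stands in for the improper integral (the tails beyond that are negligible). The same sum also recovers the original integral $\int_{-\infty}^{\infty} e^{-x^2} \, dx = \sqrt{\pi}$.

```python
import math

def midpoint_integral(f, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

k = 2.0                                # arbitrary positive constant
A = math.sqrt(k / (2 * math.pi))       # claimed normalizing constant

# p(x) = A * exp(-k x^2 / 2) should integrate to 1...
total = midpoint_integral(lambda x: A * math.exp(-k * x**2 / 2), -10, 10)
print(round(total, 6))  # → 1.0

# ...and the original Gaussian integral should equal sqrt(pi).
gauss = midpoint_integral(lambda x: math.exp(-x**2), -10, 10)
print(abs(gauss - math.sqrt(math.pi)) < 1e-6)  # → True
```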