Possible Duplicate:
Motivation behind standard deviation?
In statistics one very often sees something of the sort: $$ \textrm{quantity}=\sqrt{\frac {\sum(x-\mu)^2} {N}} $$ used to measure things like standard deviation ($\mu$ is the mean here).
It seems that just taking the absolute value of the difference would give a perfectly good measure of the same thing: $$ \textrm{quantity}=\frac {\sum{\Bigl|x-\mu\Bigr|}} {N} $$
How did we end up with those squares?
Simply because the square is easier to work with analytically: $(x-\mu)^2$ is differentiable everywhere, whereas $|x-\mu|$ is not differentiable at $x=\mu$, which makes minimization and other calculus-based arguments much cleaner.
(One might ask similar questions about why we use least squares instead of least fourth powers.)
Note that when the foundations of statistics and probability were laid, there were no computers (in the modern sense). However, some people now use absolute-value approaches in lieu of squares and square roots, because modern computing power finally makes that practical.
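To make the comparison concrete, here is a minimal Python sketch (function names are my own) computing both quantities from the question for the same data. Note that the two measures generally give different numbers, even though both summarize spread about the mean:

```python
import math

def std_dev(xs):
    """Population standard deviation: sqrt of the mean squared deviation."""
    mu = sum(xs) / len(xs)
    return math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))

def mean_abs_dev(xs):
    """Mean absolute deviation about the mean."""
    mu = sum(xs) / len(xs)
    return sum(abs(x - mu) for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # mean is 5.0
print(std_dev(data))       # 2.0
print(mean_abs_dev(data))  # 1.5
```

The standard deviation weights large deviations more heavily (because of the square), which is why it exceeds the mean absolute deviation here.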