I have been learning statistics recently. So far, many of the statistical tests I have seen, e.g. the F-test and ANOVA, use variance as a component. I can hardly find any statistical test that uses the third or even higher moments. It seems to me that statistics strongly favors the second moment. In linear regression, we minimize the sum of squared residuals; in Ridge regression, we use the $L^2$ norm to penalize the estimated coefficients.
Why is the second moment (or 'squared' or '$L^2$' norm) so ubiquitous in the statistical world?
The second moment is always non-negative, so one can optimize it by making it small. You can't do that with the third moment, which can be negative and so has no minimum to seek.

The second moment also admits simple derivatives: differentiating a sum of squares gives linear equations, which are easy to solve.

Finally, the second moment is simpler to compute than higher-order moments.
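To illustrate the "simple derivatives" point, here is a minimal sketch in NumPy (the data and coefficients are made up for illustration): minimizing the sum of squared residuals $\|y - Xb\|^2$ has gradient $-2X^\top(y - Xb)$, so setting it to zero yields the linear normal equations $X^\top X\, b = X^\top y$, solvable in one step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: design matrix with an intercept column and one predictor.
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.1, size=50)

# Minimizing ||y - X b||^2: the gradient is -2 X^T (y - X b),
# so setting it to zero gives the linear normal equations X^T X b = X^T y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's built-in least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))  # True
```

With a cubed or absolute-value loss there is no such closed-form linear system, and you would need iterative optimization instead.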