What's the intuitive purpose of RMSE (root mean square error) compared to MAE (mean absolute error)?


When I want to find out the average difference / error between two datasets, such as the predicted output vs. observed output of a system (e.g., I predict the output to be 100 V; how does the actual measured output compare?), intuitively I would compute:

\begin{align} \frac{\sum_{i = 1}^{n} {|{P(i)-O(i)}|}}{n} \end{align}

where $P(i)$ is the predicted value and $O(i)$ is the observed value for each instance $i$ up to $n$ instances. This is what is defined as the MAE, or mean absolute error.

However, I have seen another way to compare observed and predicted values using the RMSE, or root mean square error, defined as,

\begin{align} \sqrt{\frac{\sum_{i = 1}^{n} {\left({P(i)-O(i)}\right)^2}}{n}} \end{align}
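To make the difference between the two concrete, here is a minimal sketch (the voltages are made-up illustration values, not from any real system) computing both metrics on the same residuals:

```python
# Made-up predicted vs. observed outputs (in volts); the last point is a
# deliberate outlier to show how the two metrics diverge.
predicted = [100.0, 102.0, 98.0, 110.0]
observed = [101.0, 100.0, 99.0, 100.0]

errors = [p - o for p, o in zip(predicted, observed)]
n = len(errors)

# MAE: average of absolute errors.
mae = sum(abs(e) for e in errors) / n

# RMSE: square the errors first, average, then take the square root.
rmse = (sum(e * e for e in errors) / n) ** 0.5

print(mae)   # 3.5
print(rmse)  # ~5.15 -- larger than MAE because squaring weights the 10 V outlier heavily
```

With errors of 1, 2, 1, and 10 V, the MAE is 3.5 V while the RMSE is about 5.15 V: squaring before averaging makes the single large error dominate, which is exactly the behavioral difference the two formulas encode.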

For someone who just wants a good sense of the average difference between two sets of data (in a predicted vs. observed output scenario), which metric would be more useful, RMSE or MAE?