A well-known estimation error measure is the so-called minimum mean square error (MMSE), defined as \begin{align} E[|W-\hat{W}(V)|^2] \end{align} where $W$ is the random variable we want to estimate and $V$ is the observation, based on which the estimator $\hat{W}(V)$ infers what $W$ is.
MMSE is a well-known quantity that has received a lot of attention, for the following reasons (not an exhaustive list):
- it is easy to analyse analytically;
- the error is measured in the $L^2$ norm, so the space of finite-variance random variables is a Hilbert space, and the optimal estimator $\hat{W}(V) = E[W \mid V]$ is the orthogonal projection of $W$ onto the functions of $V$;
- MMSE also has an interpretation as the energy of the error.
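The projection property can be checked numerically. Below is a minimal sketch (the joint Gaussian model for $W$ and $V$ is an arbitrary toy choice, not from the question): for $W \sim N(0,1)$ and $V = W + N(0,1)$, the conditional mean is $E[W \mid V] = V/2$, and it achieves a smaller mean square error than a naive competitor.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy model (assumed for illustration): W ~ N(0,1), V = W + independent N(0,1) noise.
w = rng.normal(0.0, 1.0, n)
v = w + rng.normal(0.0, 1.0, n)

# For this jointly Gaussian pair, E[W | V] = Cov(W,V)/Var(V) * V = V/2.
mmse_est = v / 2   # conditional mean (the orthogonal projection)
naive_est = v      # a suboptimal competitor

mse_opt = np.mean((w - mmse_est) ** 2)    # theoretical value: 1 - 1/2 = 0.5
mse_naive = np.mean((w - naive_est) ** 2)  # theoretical value: 1.0
print(mse_opt, mse_naive)
```

The conditional mean gives an MSE close to the theoretical minimum of $0.5$, while the naive estimator's MSE is close to $1$.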
My questions are the following.
Do other estimation errors \begin{align} E[|W-\hat{W}(V)|^p] \end{align}
for $p \in \mathbb{R}$ have
- some 'advantages'(open to interpretation) over MMSE?
- arise naturally in some applications?
- has anyone studied these higher-order errors, and for what purpose?
- anything else that you can think of?
Please feel free to edit and improve the question.
Edit:
See the comment by A.S. where he notes that $p=1$ is also commonly used.
Also, $p=\infty$, taken as the essential supremum, would correspond to the maximum possible error. It may or may not exist, depending on $W$, $V$, and $\hat{W}(V)$.
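To illustrate how the choice of $p$ changes the optimal estimator (a quick numerical sketch; the exponential toy distribution is an arbitrary assumption): the constant $c$ minimizing $E[|W-c|^2]$ is the mean of $W$, while the constant minimizing $E[|W-c|]$ is the median, so for a skewed distribution the $p=1$ and $p=2$ criteria pick different answers.

```python
import numpy as np

rng = np.random.default_rng(1)
# Skewed toy distribution: Exponential(1) has mean 1 and median ln 2 ≈ 0.693.
w = rng.exponential(1.0, 200_000)

# Brute-force search over constant estimators c on a grid.
cs = np.linspace(0.0, 3.0, 601)
l2_loss = [np.mean((w - c) ** 2) for c in cs]
l1_loss = [np.mean(np.abs(w - c)) for c in cs]

c_l2 = cs[np.argmin(l2_loss)]  # minimizer of E[|W-c|^2]: close to the sample mean
c_l1 = cs[np.argmin(l1_loss)]  # minimizer of E[|W-c|]:   close to the sample median
print(c_l2, c_l1)
```

Here the $L^2$-optimal constant lands near $1$ and the $L^1$-optimal constant near $\ln 2$, which is one concrete sense in which $p=1$ "arises naturally": it yields median-type (more outlier-robust) estimates.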