In Casella & Berger, the derivation of the Bias-Variance tradeoff assumes that the error of parameter inference is measured by Mean Squared Error (MSE). Given an estimator $W$ of $\theta$, the MSE can be decomposed as:
$E_\theta\big[(W - \theta)^2\big]= Var_\theta(W) + \big(E_\theta W - \theta\big)^2$
where the first term is the variance of the estimator and the second term is its squared bias.
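The decomposition can be checked numerically. Below is a minimal sketch with a hypothetical setup (not from any particular book): a deliberately biased "shrunk mean" estimator of a normal mean, so that both terms of the decomposition are nonzero. The identity $E_\theta[(W-\theta)^2] = Var_\theta(W) + (E_\theta W - \theta)^2$ then holds exactly for the empirical moments.

```python
import numpy as np

# Hypothetical example: estimate theta = 2.0 from n i.i.d. N(theta, 1) draws
# using the shrunk estimator W = 0.9 * sample mean, which is biased on purpose.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 20, 200_000

samples = rng.normal(theta, 1.0, size=(reps, n))
W = 0.9 * samples.mean(axis=1)           # one estimate per replication

mse = np.mean((W - theta) ** 2)          # empirical E[(W - theta)^2]
var = np.var(W)                          # empirical Var(W)
bias_sq = (np.mean(W) - theta) ** 2      # empirical (E[W] - theta)^2

# The decomposition holds exactly for empirical moments (up to float error):
assert abs(mse - (var + bias_sq)) < 1e-8
print(mse, var, bias_sq)
```

Here the true values are $Var(W) = 0.81/20 = 0.0405$ and $\text{bias}^2 = 0.04$, which the simulated quantities approximate.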
Essentially the same presentation is given in most books, e.g. The Elements of Statistical Learning, except that there the decomposition also includes an irreducible error (noise) term.
But what if the assumption of MSE is violated? Can an arbitrary error measure be decomposed into some function of the variance and the bias?
In particular, what about Mean Absolute Error, $E_\theta|W - \theta|$, or Median Absolute Error?
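To make the question concrete, here is a hedged numerical sketch (same kind of hypothetical biased "shrunk mean" estimator of a normal mean): the empirical MSE splits exactly into variance plus squared bias, but the empirical MAE does not match any obvious additive combination of those two quantities.

```python
import numpy as np

# Hypothetical illustration: for a biased estimator, MSE = Var + Bias^2
# holds exactly, but MAE = E|W - theta| has no analogous additive split.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 20, 200_000

W = 0.9 * rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)

mae = np.mean(np.abs(W - theta))
var, bias = np.var(W), np.mean(W) - theta

# The MSE identity holds exactly for empirical moments:
assert abs(np.mean((W - theta) ** 2) - (var + bias ** 2)) < 1e-8
# ...but MAE coincides with neither of these naive candidates:
print(mae, np.sqrt(var) + abs(bias), np.sqrt(var + bias ** 2))
```

This does not prove that no decomposition exists for MAE, only that the MSE formula does not carry over term by term, which is exactly what the question asks about.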
If not, is it then "illegal" to speak of a Bias-Variance tradeoff when the loss function is not MSE?