Let $X$ be a random variable with expectation value $\mathbb{E}(X)=\mu$.
Is there a (reasonably standard) notation to denote the "centered" random variable $X - \mu$?
And, while I'm at it, if $X_i$ is a random variable, $\forall\,i \in \mathbf{n} \equiv \{0,\dots,n-1\}$, and if $\overline{X} = \frac{1}{n}\sum_{i\in\mathbf{n}} X_i$, is there a notation for the random variable $X_i - \overline{X}$? (This second question is "secondary". Feel free to disregard it.)
In some contexts, $\varepsilon_i = X_i-\mu$ is called an "error" and $\hat\varepsilon_i=X_i-\bar X$ is called a "residual".
Notice that if $X_1,\ldots,X_n$ are i.i.d. then the errors $\varepsilon_i=X_i-\mu$ are independent, but the residuals $\hat\varepsilon_i=X_i-\bar X$ are not: they are constrained to sum to $0$, so they are negatively correlated.
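A quick numerical illustration of that negative correlation (a sketch in Python/NumPy; the distribution, sample size $n=5$, and number of replications are arbitrary choices). For i.i.d. variables with variance $\sigma^2$, one can show $\operatorname{cov}(\hat\varepsilon_i,\hat\varepsilon_j)=-\sigma^2/n$ for $i\ne j$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 100_000
sigma = 1.0
# Each row is one i.i.d. sample X_1, ..., X_n with mu = 2:
X = rng.normal(loc=2.0, scale=sigma, size=(reps, n))

# Residuals X_i - Xbar within each sample:
resid = X - X.mean(axis=1, keepdims=True)

# They sum to zero (up to floating-point error) in every sample:
assert np.allclose(resid.sum(axis=1), 0.0)

# Empirical covariance of two residuals across replications;
# theory predicts -sigma^2 / n = -0.2 here:
cov01 = np.cov(resid[:, 0], resid[:, 1])[0, 1]
print(cov01)  # close to -0.2
```

The errors $X_i-\mu$, by contrast, would show covariance close to $0$ in the same experiment.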
In the simple linear regression problem in which $\mathbb E(X_i) = \alpha+\beta w_i$, the errors $\varepsilon_i = X_i - (\alpha+\beta w_i)$ are often taken to be independent. The residuals $\hat\varepsilon_i=X_i - (\hat\alpha+\hat\beta w_i)$, where $\hat\alpha$ and $\hat\beta$ are least-squares estimates depending on all of the $X_i$ and $w_i$, $i=1,\ldots, n$, are not: they are constrained to satisfy the two equalities $\hat\varepsilon_1+\cdots+\hat\varepsilon_n=0$ and $\hat\varepsilon_1 w_1+\cdots+\hat\varepsilon_n w_n=0$. The correlation between $\hat\varepsilon_i$ and $\hat\varepsilon_j$ depends on $w_1,\ldots,w_n$ and on $i$ and $j$. (Specifically, suppose the errors are uncorrelated and all have the same variance---an assumption called homoscedasticity---and let $M\in\mathbb R^{n\times 2}$ be the design matrix whose first column is all $1$s and whose second column is $w_1,\ldots,w_n$. Then $\operatorname{var}(\varepsilon)(I_n-M(M^TM)^{-1}M^T)$ is the matrix whose $(i,j)$ entry is $\operatorname{cov}(\hat\varepsilon_i,\hat\varepsilon_j)$.)
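The two constraints and the covariance formula can be checked numerically (a sketch; the particular $w_i$, true coefficients, and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
w = np.linspace(0.0, 5.0, n)
M = np.column_stack([np.ones(n), w])   # design matrix: column of 1s, then w
alpha, beta, sigma = 1.0, 0.5, 1.0

# One sample from the model E(X_i) = alpha + beta * w_i:
X = alpha + beta * w + rng.normal(scale=sigma, size=n)

# Least-squares estimates (hat{alpha}, hat{beta}) and residuals:
coef, *_ = np.linalg.lstsq(M, X, rcond=None)
resid = X - M @ coef

# The residuals satisfy both linear constraints (up to rounding):
print(resid.sum())   # ~ 0
print(resid @ w)     # ~ 0

# Covariance matrix of the residuals: var(eps) * (I_n - M (M^T M)^{-1} M^T).
# Its off-diagonal entries vary with i and j, unlike those of the errors.
H = M @ np.linalg.solve(M.T @ M, M.T)
cov_resid = sigma**2 * (np.eye(n) - H)
```

Note that $I_n-M(M^TM)^{-1}M^T$ is a projection of rank $n-2$, which is another way of seeing that the residuals lose two degrees of freedom to the two constraints.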
See this article: http://en.wikipedia.org/wiki/Errors_and_residuals_in_statistics