Suppose I am approximating the following system using ordinary least squares regression
$$y = p_0 + p_1 x_1 + p_2 x_2 + \cdots + p_M x_M = \xi P$$
I know that a property of the least squares estimator (when an intercept is included, as here) is that the sum of the residuals, $r_i = \hat{y}_i - y_i$, is equal to zero. However, what I am finding is that the sum of the residuals multiplied by any one of the regressors, $\xi_m \in (\mathbf{1}, x_1, x_2, ..., x_M)$, is also equal to zero. I have found this to be true in every simulation I have run, but I am not sure how to prove it.
$$\sum_{i=1}^{N} r_i \xi_{i,m} = 0$$
Intuitively it makes sense to me. For example, under the usual assumptions the residuals are normally distributed with mean zero. The normal density is a continuous, even function; denote it $f_N(x)$. Then the integral $\int f_N(x)\,x\,dx = 0$, because you are integrating a (now) odd function over a symmetric interval, and likewise $\Delta x \sum f_N(x)\,x = 0$.
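For reference, here is a minimal sketch of the kind of simulation check described above, using NumPy. The coefficients, sample size, and noise level are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 3

# Design matrix with an intercept column of ones: xi = (1, x_1, ..., x_M)
X = np.column_stack([np.ones(N), rng.normal(size=(N, M))])
beta_true = np.array([1.0, 2.0, -0.5, 0.3])   # arbitrary coefficients
y = X @ beta_true + rng.normal(size=N)        # add noise

# OLS fit and residuals r_i = yhat_i - y_i
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
r = X @ beta_hat - y

# Both the residual sum and the residual-regressor products vanish
print(r.sum())    # ~ 0 up to floating-point error
print(X.T @ r)    # each entry ~ 0
```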
Thanks!
Consider a regression model
$$y_i=X_i'\beta+\epsilon_i$$
The least squares problem is to minimize
$$\sum_{i=1}^{n}(y_i-X_i'\beta)^2$$
The first order condition for the optimal coefficient vector $\hat{\beta}$ is
$$-2\sum_{i=1}^{n}(y_i-X_i'\hat{\beta})X_i=0$$
or
$$\sum_{i=1}^{n}\hat{\epsilon}_iX_i=0$$

where $\hat{\epsilon}_i = y_i - X_i'\hat{\beta}$ is the residual (as distinct from the unobserved error $\epsilon_i$),
which says that the residuals are orthogonal to each regressor in sample. When one of the regressors is a constant, this also implies that the sample covariance between the residual and each regressor is zero.
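Equivalently, in matrix form the first order condition is the set of normal equations $X'X\hat{\beta} = X'y$, so $X'(y - X\hat{\beta}) = 0$ holds by construction, whatever the data are. A quick numerical check (with arbitrary simulated data) illustrates this:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 4

# Design with an intercept column; y is arbitrary -- no model needs to be "true"
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)

# Solve the normal equations X'X beta = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# X' times the residual vector is zero (up to floating-point error)
print(X.T @ resid)
```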