Say I have a set of points Y and I want to accurately predict the values of Y using three variables X1, X2, X3. Hence my equation is
Y=intercept + C1*X1 + C2*X2 + C3*X3
After performing linear regression I get the values of C1, C2, C3 and the intercept. For every combination of X1, X2, X3 I then have a predicted point Y' that is supposed to be as close as possible to Y.
Now my question is: are the sums of the values in Y and Y' supposed to be equal?
When the analysis is performed using ordinary least squares (OLS) regression, the sum of the predicted Y values equals the sum of the observed Y values in every model that includes an intercept. However, this is not necessarily true for models that do not include an intercept.
To show this, first note that the difference between the sum of the observed Y values and the sum of the predicted Y values equals the sum of the residuals. Considering for simplicity univariable regression (the same argument applies to multivariable models such as the one in the question), the OLS estimates of the regression parameters A (slope) and B (intercept) minimize the sum of squared residuals. Since the regression model is $Y_i = aX_i + b+\epsilon_i$, where $\epsilon_i$ represents the residual in the estimation of $Y_i$, the sum of squared residuals is
$$\sum_{i=1}^n(Y_i - aX_i - b)^2$$
To minimize this function, the first partial derivatives with respect to $a$ and $b$ must both equal zero when evaluated at $A$ and $B$. Taking the partial derivative with respect to $b$ and setting its value at $(A, B)$ to zero, we get
$$ -2\sum_{i=1}^n(Y_i - AX_i - B) = 0 $$
Since $(Y_i - AX_i - B) $ is the residual, it follows that
$$ \sum_{i=1}^n\epsilon_i = 0 $$ Therefore, because the sum of the residuals is zero, the sum of the predicted $Y$ values must equal the sum of the observed $Y$ values. It is also evident that this need not hold in models lacking the constant term, since the condition above comes precisely from differentiating with respect to the intercept.
Here is an example to illustrate the issue. Suppose that in a sample of three measurements of a single independent variable $X$, the target variable $Y$ takes the following $(X,Y)$ values: $(1,2), (2,4), (3,8)$. OLS regression yields the regression line $Y=3X-4/3$, which gives a sum of squared errors of $2/3$. The predicted values of $Y$ for $X$ equal to 1, 2, and 3 are $5/3$, $14/3$, and $23/3$. Their sum is $42/3=14$, which is exactly the sum of the three observed $Y$ values: $2+4+8=14$.
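This arithmetic is easy to check numerically; here is a short NumPy sketch (using `np.polyfit`, one of several ways to fit the intercept model):

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 4.0, 8.0])

# Fit Y = a*X + b by OLS; a degree-1 polynomial fit includes an intercept
a, b = np.polyfit(X, Y, 1)           # a = 3, b = -4/3
Y_pred = a * X + b                   # [5/3, 14/3, 23/3]

print(Y_pred.sum(), Y.sum())         # both 14: predicted and observed sums agree
print(np.sum((Y - Y_pred) ** 2))     # sum of squared errors = 2/3
```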
Now let us build another linear model with no intercept using the same data. The OLS procedure leads to the regression line $Y=17/7\,X$. The predicted values of $Y$ for $X$ equal to 1, 2, and 3 are $17/7$, $34/7$, and $51/7$. Their sum is $102/7 \approx 14.57$, which is higher than the sum of the three observed $Y$ values: $2+4+8=14$.
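The no-intercept fit can be verified the same way; a sketch using `np.linalg.lstsq` on the single column $X$ (one of several ways to fit a through-the-origin model):

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 4.0, 8.0])

# Fit Y = a*X with no intercept: least squares on the single column X
a = np.linalg.lstsq(X[:, None], Y, rcond=None)[0][0]   # a = 17/7
Y_pred = a * X

print(Y_pred.sum(), Y.sum())   # 102/7 ~ 14.57 vs 14: sums no longer agree
print((Y - Y_pred).sum())      # -4/7: residuals no longer sum to zero
```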