I have read, and I think I understand, that in linear regression the predicted response is the mean (expected value) of $y$ conditioned on a particular regressor value $x_d$.
However, there is an additional claim whose connection to this I would like to understand: that a regression coefficient $\beta$ represents the mean change in the response variable for a unit change in the corresponding regressor. How are these two statements related?
Many thanks
In linear regression, we assume that $\mathbb{E}[y \mid \mathbf{x}]$, where $\mathbf{x}$ is a given, known vector of predictors $$\mathbf{x}^{T} = \begin{bmatrix} 1 & x_1 & x_2 &\cdots &x_p \end{bmatrix}$$is given by the form $$\mathbb{E}[y \mid \mathbf{x}] = \mathbf{x}^{T}\boldsymbol\beta=\beta_0+\beta_1x_1+\beta_2x_2+\cdots+\beta_px_p\text{.}$$ For any $i = 1, \dots, p$, notice that $$\dfrac{\partial \mathbb{E}[y \mid \mathbf{x}]}{\partial x_i} = \beta_i\text{.}$$ Recall the intuition behind partial derivatives: holding all other variables constant, the partial derivative is the "slope" of the function in the direction of $x_i$. Since that slope is $\beta_i$, each unit increase in $x_i$ (holding the other regressors fixed) changes the conditional expected value (mean) of $y$ by $\beta_i$. So the two statements are two sides of the same fact: the model specifies a conditional mean, and the coefficients describe how that conditional mean moves with each regressor.
(Of course, this assumes that this model for $\mathbb{E}[y \mid \mathbf{x}]$ is actually correct.)
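You can see this numerically. The following sketch (using simulated data with made-up coefficients, fit via ordinary least squares with NumPy) shows that increasing one regressor by a unit while holding the others fixed changes the fitted conditional mean by exactly the corresponding coefficient:

```python
import numpy as np

# Simulated data from a hypothetical model: y = 2 + 3*x1 - 1*x2 + noise
# (the coefficients 2, 3, -1 are illustrative, not from the question)
rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2 + 3 * x1 - 1 * x2 + rng.normal(scale=0.1, size=n)

# Fit OLS on the design matrix [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted conditional mean at some point x, and at x with x1 raised by one unit
x = np.array([1.0, 0.5, -0.2])          # [constant term, x1, x2]
x_plus = x + np.array([0.0, 1.0, 0.0])  # unit increase in x1, x2 held fixed
change = x_plus @ beta - x @ beta       # equals beta[1] exactly, by linearity
print(beta[1], change)
```

The difference `change` equals `beta[1]` to machine precision regardless of the point `x` chosen, which is exactly the partial-derivative statement above: linearity makes the unit-change effect constant everywhere.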