The context: Part of an assignment for a linear regression course, examining the relationship between the plasma retinal levels of patients and various lifestyle factors.
The question: What happens, on average, to the plasma retinal levels if we increase the age? Does the size of the change depend on the age?
My Answer: Given the linear model $\hat{Y}=\hat{\beta_{0}}+\hat{\beta_{1}}X$ (Y = plasma retinal levels, X = age), increasing the age by $n$ years gives, in general,
$$\hat{\beta_{0}}+\hat{\beta_{1}}(X+n)=\hat{\beta_{0}}+\hat{\beta_{1}}X+n\hat{\beta_{1}}=\hat{Y}+n\hat{\beta_{1}}.$$ This implies that the size of the average change in plasma retinal levels does indeed depend on the age of the patients.
The feedback: The answer to the question is not correct, i.e. the size of the change is not dependent on the age of the patient (ask yourself why it's not... Hint: your interpretation of the linear model is not correct)
The plea for help: I have been racking my brain trying to see what exactly is wrong with my model, but I do not see the problem. Any help greatly appreciated.
*EDIT:* Since posting, I realised it might be argued that since $\frac{d\hat{Y}}{dX}=\hat{\beta}_{1}$, the change does not depend on an increase in age. Is this correct? I would nonetheless still appreciate clarification of what is wrong with my original answer.
The wording of your interpretation is not correct. Plasma retinal levels and age being positively related via a linear model is not the same as the *amount* of increase in retinal levels being dependent on age; if the amount of change varied with age, the relationship wouldn't be linear. In fact, your own algebra shows this: the change you derived is $n\hat{\beta}_{1}$, which depends only on $n$ and the slope, not on the starting age $X$. The slope of the line is constant, so increasing age by $n$ years shifts the predicted plasma level by the same amount $n\hat{\beta}_{1}$ no matter what age you start from.
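A quick numerical check makes this concrete. The sketch below uses made-up coefficient values (not fitted from any real data) and shows that under a simple linear model the predicted change for an $n$-year increase in age is identical at every starting age:

```python
# Hypothetical fitted coefficients, chosen only for illustration
beta0 = 250.0  # intercept
beta1 = 1.5    # slope: average change in plasma level per year of age

def predicted_plasma(age):
    """Predicted plasma level under the fitted line Y-hat = beta0 + beta1 * age."""
    return beta0 + beta1 * age

n = 5  # increase age by n years
for age in (20, 40, 60):
    change = predicted_plasma(age + n) - predicted_plasma(age)
    # The change equals n * beta1 = 7.5 regardless of the starting age
    print(f"age {age} -> {age + n}: change = {change}")
```

Whatever values the fitted coefficients take, the difference in predictions is always $n\hat{\beta}_{1}$; the starting age cancels out, exactly as in the algebra above.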