A linear regression model may be written either as
$Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$
or as
$Y_i = \alpha_0 + \alpha_1 (X_i - \bar x) + \epsilon_i$.
Use the method of least squares to estimate $\hat \alpha_0$ and $\hat \alpha_1$.
I am getting $\hat \alpha_0 = \bar Y$, which seems wrong.
Any help appreciated, thanks.
To start, differentiate $S=\sum_{i=1}^n \left(y_i-a_0-a_1x_i+a_1\overline x \right)^2$ w.r.t. $a_1$:
$\frac{\partial S}{\partial a_1}=-2\cdot \sum_{i=1}^n \left(y_i-a_0-a_1x_i+a_1\overline x \right)\cdot (x_i-\overline x)=0$
Multiplying out the brackets and dropping the factor $-2$:
$\sum_{i=1}^n x_iy_i-\overline x \sum_{i=1}^n y_i-a_0\sum_{i=1}^n x_i+a_0\overline x \sum_{i=1}^n 1-a_1\sum_{i=1}^n x_i^2 + \overline x\cdot a_1 \sum_{i=1}^n x_i+a_1 \overline x \sum_{i=1}^n x_i-a_1 \overline x ^2 \sum_{i=1}^n 1=0$
Isolating $a_1$, and using $\sum_{i=1}^n x_i=n\cdot \overline x$ and $\sum_{i=1}^n y_i=n\cdot \overline y$ (note the $a_0$ terms cancel):
$a_1 \cdot \left[ 2\overline x \sum_{i=1}^n x_i - n\cdot \overline x ^2- \sum_{i=1}^n x_i ^2 \right]+\sum_{i=1}^n x_iy_i-\overline x \sum_{i=1}^n y_i=0$
Now solve for $a_1$; $a_0$ has disappeared. Using $\sum_{i=1}^n x_i = n\overline x$ once more, the bracket becomes $n\overline x^2 - \sum_{i=1}^n x_i^2$, so
$a_1 = \dfrac{\sum_{i=1}^n x_iy_i - n\overline x\,\overline y}{\sum_{i=1}^n x_i^2 - n\overline x^2}$,
the familiar least-squares slope.
Differentiating $S$ w.r.t. $a_0$:
$\frac{\partial S}{\partial a_0}=-2\cdot \sum_{i=1}^n \left(y_i-a_0-a_1x_i+a_1\overline x \right)=0$
The $a_1$ terms cancel here, since $\sum_{i=1}^n x_i = n\overline x$, leaving $n\overline y - n a_0 = 0$, i.e. $\hat a_0 = \overline y$. So your result $\hat \alpha_0 = \bar Y$ is in fact correct for the centered model.
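If it helps, here is a quick numerical sanity check of the derivation (not part of the original answer; it uses NumPy, and the variable names are my own). It fits the centered model $y_i = a_0 + a_1(x_i - \bar x)$ with a generic least-squares solver and compares against the closed-form estimates above:

```python
# Sanity check: closed-form centered-model estimates vs. a generic
# least-squares solve. Expectation: a0_hat == ybar and a1_hat matches
# the slope from the derivation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=50)
n = len(x)

xbar, ybar = x.mean(), y.mean()

# Closed-form estimates from the derivation above.
a1_hat = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x**2) - n * xbar**2)
a0_hat = ybar  # the a_0 normal equation reduces to a_0 = ybar

# Generic least squares on the centered design matrix [1, x - xbar].
X = np.column_stack([np.ones_like(x), x - xbar])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose([a0_hat, a1_hat], coef))  # True
```

The intercept column and the centered column $x - \bar x$ are orthogonal (the latter sums to zero), which is exactly why $a_0$ dropped out of the $a_1$ equation and vice versa.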