Solve for $\beta$. (Series)


I am proving the least squares estimates of the regression coefficients and I've come across these two equations.

$$\sum_{i=1}^{n}y_i=\alpha n+\beta \sum_{i=1}^{n}x_i$$

$$\sum_{i=1}^{n}y_ix_i=\alpha \sum_{i=1}^{n}x_i+\beta \sum_{i=1}^{n}x_i^2$$

I am supposed to solve what is $\beta$. The answer given is $$\beta=\frac{n(\sum_{i=1}^{n}x_iy_i)-(\sum_{i=1}^{n}x_i)(\sum_{i=1}^{n}y_i)}{n(\sum_{i=1}^{n}x_i^2)-(\sum_{i=1}^{n}x_i)^2}$$
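Before working through the algebra, the stated answer can be checked numerically against NumPy's built-in linear fit; the data below are made up purely for illustration:

```python
# Sanity check of the closed-form slope against np.polyfit (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

# The closed-form least-squares slope from the question.
beta = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)

# np.polyfit(x, y, 1) returns [slope, intercept] for the same least-squares line.
slope, intercept = np.polyfit(x, y, 1)
print(beta, slope)  # both are approximately 1.96 for this data
```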

I've tried many times to work it out by the substitution method, but failed; it's tedious.

Hope someone can help me out. Thanks in advance.

There are 6 answers below.

Best answer:

\begin{align*} \sum_{i=1}^{n}y_i &= \alpha n+\beta \sum_{i=1}^{n}x_i \\ \sum_{i=1}^{n}y_ix_i &= \alpha \sum_{i=1}^{n}x_i+\beta \sum_{i=1}^{n}x_i^2 \end{align*} Multiply the first by $\sum_{i=1}^{n}x_i$ and the second by $n$ to make the terms containing $\alpha$ match. \begin{align*} \sum_{i=1}^{n}y_i \sum_{i=1}^{n}x_i &= \alpha n \sum_{i=1}^{n}x_i + \beta \left( \sum_{i=1}^{n}x_i \right)^2 \\ n \sum_{i=1}^{n}y_ix_i &= \alpha n \sum_{i=1}^{n}x_i + \beta n \sum_{i=1}^{n}x_i^2 \end{align*} Subtract the first from the second, which cancels the $\alpha$ terms. $$ n \sum_{i=1}^{n}y_ix_i - \sum_{i=1}^{n}y_i \sum_{i=1}^{n}x_i = \beta n \sum_{i=1}^{n}x_i^2 - \beta \left( \sum_{i=1}^{n}x_i \right)^2 $$ Now factor out the common $\beta$ on the right-hand side and divide to isolate it. $$ n \sum_{i=1}^{n}y_ix_i - \sum_{i=1}^{n}y_i \sum_{i=1}^{n}x_i = \beta \left( n \sum_{i=1}^{n}x_i^2 - \left( \sum_{i=1}^{n}x_i \right)^2 \right) $$ and then $$ \frac{n \sum_{i=1}^{n}y_ix_i - \sum_{i=1}^{n}y_i \sum_{i=1}^{n}x_i}{n \sum_{i=1}^{n}x_i^2 - \left( \sum_{i=1}^{n}x_i \right)^2 } = \beta \text{.}$$
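The same elimination can be checked symbolically with sympy; the symbols $S_x, S_y, S_{xy}, S_{xx}$ below are stand-ins (chosen here for illustration) for $\sum x_i$, $\sum y_i$, $\sum x_iy_i$, $\sum x_i^2$:

```python
import sympy as sp

n, Sx, Sy, Sxy, Sxx, alpha, beta = sp.symbols('n S_x S_y S_xy S_xx alpha beta')

# Normal equations, written as expressions equal to zero.
e1 = alpha * n + beta * Sx - Sy      # Sum y_i    = alpha*n      + beta*Sum x_i
e2 = alpha * Sx + beta * Sxx - Sxy   # Sum x_iy_i = alpha*Sum x_i + beta*Sum x_i^2

# Multiply e1 by Sx and e2 by n, then subtract: the alpha terms cancel.
eliminated = sp.expand(n * e2 - Sx * e1)
assert alpha not in eliminated.free_symbols

beta_solution = sp.solve(eliminated, beta)[0]
print(beta_solution)  # (n*Sxy - Sx*Sy)/(n*Sxx - Sx**2), up to rearrangement
```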

Answer:

Your equation system is of the form $$A=\alpha n+\beta B$$ $$C=\alpha B+\beta D$$ From the second equation we get $$\alpha=\frac{C-\beta D}{B}$$ Plugging this into the first equation: $$A=\frac{C-\beta D}{B}\cdot n+\beta B$$ Multiplying by $B$: $$AB=Cn-\beta Dn+\beta B^2$$ so we have $$\beta=\frac{AB-Cn}{B^2-Dn}$$
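This substitution can be reproduced with sympy as a quick check (not part of the answer itself):

```python
import sympy as sp

n, A, B, C, D, beta = sp.symbols('n A B C D beta')

# alpha isolated from the second equation, C = alpha*B + beta*D.
alpha = (C - beta * D) / B

# Substitute into the first equation, A = alpha*n + beta*B, and solve for beta.
beta_solution = sp.solve(sp.Eq(A, alpha * n + beta * B), beta)[0]

# The difference from (A*B - C*n)/(B**2 - D*n) simplifies to zero.
print(sp.simplify(beta_solution - (A*B - C*n)/(B**2 - D*n)))  # 0
```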

Answer:

Hint. You may divide each term of your two equalities by $n$, obtaining $$ \begin{cases} \alpha+\bar{x}\beta=\bar{y} \\ \bar{x}\alpha+\bar{x^2}\beta=\sigma_{xy}\end{cases} $$ which is a standard system of linear equations to solve.

Here we have just set $$ \bar{x}=\frac{\sum_{i=1}^{n}x_i}n, \quad \bar{y}=\frac{\sum_{i=1}^{n}y_i}n $$ and $$ \bar{x^2}=\frac{\sum_{i=1}^{n}x^2_i}n, \quad \sigma_{xy}=\frac{\sum_{i=1}^{n}x_iy_i}n. $$
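Numerically, this 2×2 system in the means can be solved directly; the data here are invented for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Sample means corresponding to x-bar, y-bar, mean of x^2, mean of x*y.
xbar, ybar = x.mean(), y.mean()
x2bar, xybar = (x**2).mean(), (x*y).mean()

# The normalized system: [[1, xbar], [xbar, x2bar]] @ [alpha, beta] = [ybar, xybar]
M = np.array([[1.0, xbar], [xbar, x2bar]])
alpha, beta = np.linalg.solve(M, np.array([ybar, xybar]))
print(beta)  # equals (xybar - xbar*ybar) / (x2bar - xbar**2)
```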

Answer:

It is tedious because you keep the original notation.

Rewrite $$\sum_{i=1}^{n}y_i=\alpha n+\beta \sum_{i=1}^{n}x_i$$ $$\sum_{i=1}^{n}y_ix_i=\alpha \sum_{i=1}^{n}x_i+\beta \sum_{i=1}^{n}x_i^2$$ as $$S_1=n\alpha+S_2\beta$$ $$S_3=S_2\alpha+S_4\beta$$ Now solve the first for $\alpha$, substitute into the second, and solve for $\beta$.

Answer:

Use Cramer's Rule.

\begin{align} \Delta &= \begin{vmatrix} n & \sum_{i=1}^{n}x_i \\ \sum_{i=1}^{n}x_i & \sum_{i=1}^{n}x_i^2 \end{vmatrix} \\ \Delta_\alpha &= \begin{vmatrix} \sum_{i=1}^{n}y_i & \sum_{i=1}^{n}x_i \\ \sum_{i=1}^{n}x_iy_i & \sum_{i=1}^{n}x_i^2 \end{vmatrix} \\ \Delta_\beta &= \begin{vmatrix} n & \sum_{i=1}^{n}y_i \\ \sum_{i=1}^{n}x_i & \sum_{i=1}^{n}x_iy_i \end{vmatrix} \\ \alpha &= \frac{\Delta_\alpha}{\Delta} = \frac{(\sum_{i=1}^{n}y_i)(\sum_{i=1}^{n}x_i^2) - (\sum_{i=1}^{n}x_i)(\sum_{i=1}^{n}x_iy_i)}{n(\sum_{i=1}^{n}x_i^2) - (\sum_{i=1}^{n}x_i)^2} \\ \beta &= \frac{\Delta_\beta}{\Delta} = \frac{n(\sum_{i=1}^{n}x_iy_i) - (\sum_{i=1}^{n}x_i)(\sum_{i=1}^{n}y_i)}{n(\sum_{i=1}^{n}x_i^2) - (\sum_{i=1}^{n}x_i)^2} \end{align}
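The same determinants can be evaluated numerically with NumPy; the data are made up for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

Sx, Sy, Sxy, Sxx = x.sum(), y.sum(), (x * y).sum(), (x**2).sum()

# Cramer's rule: coefficient determinant and the two numerator determinants.
Delta       = np.linalg.det(np.array([[n,  Sx], [Sx,  Sxx]]))
Delta_alpha = np.linalg.det(np.array([[Sy, Sx], [Sxy, Sxx]]))
Delta_beta  = np.linalg.det(np.array([[n,  Sy], [Sx,  Sxy]]))

alpha, beta = Delta_alpha / Delta, Delta_beta / Delta
print(alpha, beta)
```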

Answer:

It's clearer to use the following notation: $$S_x=\sum_{i=1}^n x_i, \qquad S_y=\sum_{i=1}^n y_i,\qquad S_{xy}=\sum_{i=1}^n x_iy_i,\qquad S_{xx}=\sum_{i=1}^n x_i^2$$ Thus we have

$$\begin{align} S_y&=\alpha n\;\ +\beta S_x\tag{1}\\ S_{xy}&=\alpha S_x\ +\beta S_{xx}\tag{2}\\ \\ \alpha=\frac {S_y-\beta S_x}n&=\frac {S_{xy}-\beta S_{xx}}{S_x}\\\\ \beta\left(nS_{xx}-S_x^2\right)&=nS_{xy}-S_xS_y\\\\ \beta&=\frac {nS_{xy}-S_xS_y}{nS_{xx}-S_x^2}\end{align}$$ which is the required result.