Prove this formula for $R^2$ in the case there is one explanatory variable in the OLS estimator


I'm learning about the OLS estimator and having difficulty computing $R^2$. First, here is the notation used in my lecture notes:

$X_{i} \equiv\left(\begin{array}{c}{X_{i 1}} \\ {X_{i 2}} \\ {\vdots} \\ {X_{i K}}\end{array}\right)$ of size $K \times 1$, $\beta \equiv\left(\begin{array}{c}{\beta_{1}} \\ {\beta_{2}} \\ {\vdots} \\ {\beta_{K}}\end{array}\right)$ of size $K \times 1$, and $\epsilon \equiv\left(\begin{array}{c}{\epsilon_{1}} \\ {\epsilon_{2}} \\ {\vdots} \\ {\epsilon_{n}}\end{array}\right)$ of size $n \times 1$.

$X \equiv\left(\begin{array}{cccc}{X_{11}} & {X_{12}} & {\cdots} & {X_{1 K}} \\ {X_{21}} & {X_{22}} & {\cdots} & {X_{2 K}} \\ {\vdots} & {\vdots} & {\vdots} & {\vdots} \\ {X_{n 1}} & {X_{n 2}} & {\cdots} & {X_{n K}}\end{array}\right)$ of size $n \times K$, and $Y \equiv\left(\begin{array}{c}{Y_{1}} \\ {Y_{2}} \\ {\vdots} \\ {Y_{n}}\end{array}\right)$ of size $n \times 1$

My model is $Y_{i}=X_{i}^{\prime} \beta+\epsilon_{i}$ for $i=1, \ldots, n$, or equivalently $Y=X \beta+\epsilon$. From the first-order conditions, we have $$\begin{aligned} \hat{\beta} &=\left(X^{\prime} X\right)^{-1} X^{\prime} Y \\ &=\left(\sum_{i=1}^{n} X_{i} X_{i}^{\prime}\right)^{-1}\left(\sum_{i=1}^{n} X_{i} Y_{i}\right) \\ &=\left(\frac{1}{n} \sum_{i=1}^{n} X_{i} X_{i}^{\prime}\right)^{-1}\left(\frac{1}{n} \sum_{i=1}^{n} X_{i} Y_{i}\right) \end{aligned}$$
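As a quick numerical sanity check (my own sketch with made-up data; the names `rng`, `n`, `K` are not from the lecture notes), the matrix form and the summation form of $\hat{\beta}$ agree:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 50, 3
X = rng.normal(size=(n, K))                     # design matrix, n x K
Y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Matrix form: beta_hat = (X'X)^{-1} X'Y
beta_matrix = np.linalg.solve(X.T @ X, X.T @ Y)

# Summation form: (sum_i X_i X_i')^{-1} (sum_i X_i Y_i)
S_xx = sum(np.outer(X[i], X[i]) for i in range(n))
S_xy = sum(X[i] * Y[i] for i in range(n))
beta_sum = np.linalg.solve(S_xx, S_xy)

assert np.allclose(beta_matrix, beta_sum)
```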

Let $P_X = X\left(X^{\prime} X\right)^{-1} X^{\prime}$. Then the OLS fitted values $\hat{Y} \equiv X \hat{\beta}= P_XY$. The uncentered $R^2$ is $$R^2 = \frac{\hat{Y}^{\prime} \hat{Y}}{Y^{\prime} Y}$$
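Since $P_X$ is symmetric and idempotent ($P_X^{\prime} = P_X$ and $P_X P_X = P_X$), we also have $\hat{Y}^{\prime}\hat{Y} = Y^{\prime}P_X Y$, which is often convenient. A small numerical check of these facts (again a sketch with random data of my own):

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 40, 2
X = rng.normal(size=(n, K))
Y = rng.normal(size=n)

P = X @ np.linalg.solve(X.T @ X, X.T)   # projection matrix P_X
Y_hat = P @ Y                           # fitted values Y_hat = P_X Y

# P_X is symmetric and idempotent, so Y_hat' Y_hat = Y' P_X Y
assert np.allclose(P @ P, P)
assert np.allclose(P, P.T)
assert np.allclose(Y_hat @ Y_hat, Y @ P @ Y)

R2 = (Y_hat @ Y_hat) / (Y @ Y)          # uncentered R^2, between 0 and 1
assert 0.0 <= R2 <= 1.0 + 1e-12
```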

Then I have an exercise:

If $K = 1$ and $X_{i1} = 1$ for all $i =1,\dots,n$, then $$R^{2}=\frac{n \overline{Y}_{n}^{2}}{Y^{\prime} Y},$$ where $$\overline{Y}_{n} \equiv \frac{1}{n} \sum_{i=1}^{n} Y_{i}.$$


My attempt:

We have

$$\hat{Y} = X \hat{\beta} = \left(\begin{array}{c}{1} \\ {1} \\ {\vdots} \\ {1}\end{array}\right) \left(\begin{array}{c}{\hat{\beta}_1}\end{array}\right) = \left(\begin{array}{c}{\hat{\beta}_1} \\ {\hat{\beta}_1} \\ {\vdots} \\ {\hat{\beta}_1}\end{array}\right)$$

As such, $\hat{Y}^{\prime} \hat{Y} = n \hat{\beta}_1^2$. On the other hand, $n \overline{Y}_{n} = \sum_{i=1}^{n} Y_{i}$.
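To see where this should end up, here is a numerical check (my own sketch, not a proof) that in the intercept-only case $\hat{\beta}_1 = \overline{Y}_n$ and the uncentered $R^2$ reduces to $n\overline{Y}_n^2 / (Y^{\prime}Y)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
Y = rng.normal(loc=5.0, size=n)
X = np.ones((n, 1))                             # K = 1, X_{i1} = 1 for all i

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)    # OLS estimator
Y_bar = Y.mean()
assert np.allclose(beta_hat[0], Y_bar)          # beta_1_hat equals Y_bar

Y_hat = X @ beta_hat
R2 = (Y_hat @ Y_hat) / (Y @ Y)                  # uncentered R^2
assert np.allclose(R2, n * Y_bar**2 / (Y @ Y))  # matches n*Y_bar^2 / (Y'Y)
```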


After that, I'm stuck. Could you please help me finish the proof? Thank you so much!