So I was reading about linear regression in matrix terms when I came across a transition that I couldn't follow. To find the inverse of the matrix $$X^TX = \begin{bmatrix} n & \sum_{i=1}^nx_i\\ \sum_{i=1}^nx_i & \sum_{i=1}^nx_i^2 \end{bmatrix},$$ they computed its determinant $$\det{(X^TX)} = n\sum_{i=1}^nx_i^2 - \sum_{i=1}^nx_i\cdot \sum_{i=1}^nx_i$$ and equated it to $$n\sum_{i=1}^{n}(x_i - \overline x)^2.$$ How did they make this transition? Please help me understand.
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

There are 2 best solutions below.
Note that $(1)$ $\sum (x_i - \bar{x} ) ^2 = \sum x_i ^2 - n \bar{x} ^ 2$ and $(2)$ $\sum x_i = n\bar{x}$. So, just plug it in, i.e., \begin{align} |X^TX| & = n \sum x_i^2 - \sum x_i \sum x_i \\ & = n \sum x_i^2 - n^2 \bar{x} ^ 2 \\ & = n(\sum x_i ^ 2- n \bar{x}^2) \\ & = n \sum (x_i - \bar{x} ) ^2. \end{align}
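As a quick numerical sanity check of the identity used above, the two sides of $n\sum x_i^2 - \left(\sum x_i\right)^2 = n\sum (x_i - \bar{x})^2$ can be compared on arbitrary data (a sketch using NumPy; the sample values are made up for illustration):

```python
import numpy as np

# Arbitrary sample data; any real values work.
x = np.array([1.0, 2.0, 4.0, 7.0])
n = len(x)
xbar = x.mean()

lhs = n * np.sum(x**2) - np.sum(x)**2   # n*sum(x_i^2) - (sum x_i)^2
rhs = n * np.sum((x - xbar)**2)         # n*sum((x_i - xbar)^2)

print(np.isclose(lhs, rhs))  # the two expressions agree
```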
Note: $$\begin{align} n\sum (x-\bar{x})^2 &= n\sum x^2 - n\sum 2\bar{x}\cdot x + n\sum \bar{x}^2\\ &= n\sum x^2 - 2n\bar{x}\sum x + n^2\bar{x}^2\\ &= n\sum x^2 - 2n\cdot \frac{\sum x}{n}\cdot \sum x + n^2\cdot \left(\frac{\sum x}{n}\right)^2\\ &= n\sum x^2 - 2\sum x\cdot \sum x + \sum x\cdot \sum x\\ &= n\sum x^2 - \sum x\cdot \sum x. \end{align}$$
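This can also be confirmed directly on the matrix itself: build a design matrix with a leading column of ones (as in simple linear regression), take the determinant of $X^TX$, and compare it with $n\sum (x_i - \bar{x})^2$ (a sketch assuming NumPy; the data values are made up for illustration):

```python
import numpy as np

# Arbitrary data; the design matrix has a leading column of ones,
# as in simple linear regression y = b0 + b1*x.
x = np.array([1.0, 2.0, 4.0, 7.0])
n = len(x)
X = np.column_stack([np.ones(n), x])

det_direct = np.linalg.det(X.T @ X)           # det of the 2x2 matrix X^T X
det_formula = n * np.sum((x - x.mean())**2)   # n * sum((x_i - xbar)^2)

print(np.isclose(det_direct, det_formula))
```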