How to prove that the non-diagonal elements of the hat matrix (from regression) are bounded?


I want to prove this inequality: $$h_{ij}^2 \le 0.25 \quad (i \neq j),$$ where $h_{ij}$ is an element of the hat matrix $H = X(X'X)^{-1}X'$ from the multiple linear regression model $Y = X\beta + \epsilon$, with $X$ an $n\times p$ matrix. I know that for the diagonal elements we have $$0 \le h_{ii} \le 1,$$ and from the facts that $H^2 = H$ and $H$ is symmetric we can write $$h_{ii} = h_{ii}^2 + \sum_{j \neq i}{h_{ij}^2},$$ but I still don't know how to finish the proof. Maybe I'm forgetting a theorem from linear algebra that would help, or I can't yet connect some facts about regression (I'm new to this topic). Thanks for taking your time!
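For intuition, here is a small numerical sketch (using NumPy; the random design and seed are my own choices, not part of the question) that builds the hat matrix for a random $X$ and confirms the properties stated above:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 10, 3
X = rng.standard_normal((n, p))          # random n x p design matrix

# Hat matrix H = X (X'X)^{-1} X'
H = X @ np.linalg.inv(X.T @ X) @ X.T

# H is symmetric and idempotent (H^2 = H)
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)

# Diagonal elements lie in [0, 1]
d = np.diag(H)
assert np.all(d >= -1e-12) and np.all(d <= 1 + 1e-12)
```

Running this with other seeds or dimensions (as long as $X$ has full column rank) gives the same result, which is what the inequality to be proved should guarantee in general.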


There is 1 solution below.

On BEST ANSWER

Since $H$ is symmetric and idempotent ($H^2 = H$), matrix multiplication gives $$h_{ii}=\sum_{j=1}^n h_{ij}h_{ji}=\sum_{j=1}^n h_{ji}^2$$ for every $i$ in $1,\ldots, n$.

Next, we have $$h_{ii}=h_{ii}^2 + \sum_{j=1, j\neq i}^n h_{ji}^2$$ and therefore $$h_{ii}-h_{ii}^2= \sum_{j=1, j\neq i}^n h_{ji}^2.$$ The function $f(h_{ii}) = h_{ii}-h_{ii}^2$ attains its maximum at $h_{ii}=0.5$, with $f(0.5) = 0.25$; equivalently, $h_{ii}-h_{ii}^2 = \tfrac14 - \left(h_{ii}-\tfrac12\right)^2 \le \tfrac14$. Thus, $$\sum_{j=1, j\neq i}^n h_{ji}^2=h_{ii}-h_{ii}^2\leq0.25.$$ Since each term of the sum is nonnegative, every single term satisfies $h_{ji}^2\leq 0.25$ for $j \neq i$.
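The chain of identities above can also be checked numerically. A minimal sketch (assuming NumPy; the dimensions and seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 12, 4
X = rng.standard_normal((n, p))
H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix

d = np.diag(H)

# Identity h_ii = sum_j h_ji^2 (from H symmetric and idempotent)
assert np.allclose(d, np.sum(H**2, axis=0))

# Off-diagonal bound: h_ji^2 <= h_ii - h_ii^2 <= 0.25
off = H - np.diag(d)                   # H with its diagonal zeroed
assert np.all(off**2 <= (d - d**2)[None, :] + 1e-12)
assert np.all(off**2 <= 0.25 + 1e-12)
```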