For context, this question concerns the derivation of the distribution of the variance estimator in linear regression. I'm taking a 3rd-year econometrics unit that takes a very matrix-algebra-heavy approach to linear regression, but since it is run by the economics department, it doesn't really get into the linear-algebra weeds. Seeking more clarity on this particular derivation, I went to the textbook the course is based on, and I understand the fully rigorous (or at least I think fully rigorous) argument presented there apart from one key step. Let $I_n$ be the $n$-dimensional identity matrix and $X$ an $n \times k$ matrix with rank $k$. The matrix $I_n-X(X^TX)^{-1}X^T$ is seen to be symmetric and idempotent, and it has rank $n-k$. So I understand that by the spectral theorem there exists an orthogonal matrix $P$ such that $$P^T(I_n-X(X^TX)^{-1}X^T)P=\operatorname{diag}(\lambda_1,\dots,\lambda_{n-k},0,\dots,0).$$ However, the book additionally asserts that there exists an orthogonal matrix $P$ such that $$P^T(I_n-X(X^TX)^{-1}X^T)P=\operatorname{diag}(1,\dots,1,0,\dots,0),$$ where there are $n-k$ ones along the diagonal. I can only imagine this has something to do with the eigenstructure of $I_n-X(X^TX)^{-1}X^T$, but I can't figure out why $I_n-X(X^TX)^{-1}X^T$ would have only a single repeated nonzero eigenvalue, which I think is what this would require.
Additionally, while trying to fill in the gaps of the derivation my econometrics lecturer gave, I arrived at a different result, and I can't see what is wrong with my logic; if it were correct, the distribution ultimately being sought would be different from the one obtained if the above result holds. We know that for certain $n$-dimensional vectors $\underline{\hat{u}}$ and $\underline{u}$, $$\underline{\hat{u}}^T\underline{\hat{u}}=\underline{u}^T(I_n-X(X^TX)^{-1}X^T)\underline{u}.$$ Then, given that this is a scalar quantity, I argue $$\underline{\hat{u}}^T\underline{\hat{u}}=\operatorname{tr}(\underline{\hat{u}}^T\underline{\hat{u}})=\operatorname{tr}\bigl(\underline{u}^T(I_n-X(X^TX)^{-1}X^T)\underline{u}\bigr)=\operatorname{tr}(\underline{u}\,\underline{u}^T)\bigl(\operatorname{tr}(I_n)-\operatorname{tr}((X^TX)^{-1}X^TX)\bigr)=\operatorname{tr}(\underline{u}\,\underline{u}^T)\bigl(\operatorname{tr}(I_n)-\operatorname{tr}(I_k)\bigr)=(n-k)\sum_{i=1}^n u_i^2.$$ What is wrong with this manipulation of the trace?
Diagonalising $I_n-X(X^TX)^{-1}X^T$, an $n\times n$ matrix of rank $n-k$ (where $X$ has rank $k$), into a matrix with $n-k$ ones along the diagonal, and then zeros
144 Views, asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Regarding the eigenvalues: note that $A = I_n - X(X^TX)^{-1}X^T$ is not only symmetric but also, as you've said, idempotent (i.e. $A^2 = A$). With that in mind: if $\lambda$ is an eigenvalue of $A$, then there must be an associated eigenvector $x$ (so that $x \neq 0$ and $Ax = \lambda x$), which means that $$ \lambda x = Ax = A^2x = A(Ax) = A(\lambda x) = \lambda Ax = \lambda^2 x. $$ So, we have $\lambda x = \lambda^2 x$. Because $x\neq 0$, it must be that $\lambda = \lambda^2$, which means $\lambda = 0$ or $\lambda = 1$. Since $A$ has rank $n-k$, exactly $n-k$ of its eigenvalues are $1$ and the remaining $k$ are $0$, which is precisely the diagonal form your book asserts.
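As a quick numerical sanity check (my own illustration, not from the book, using an arbitrary random $X$ with $n=6$, $k=2$), the eigenvalues of $A = I_n - X(X^TX)^{-1}X^T$ do come out as exactly $n-k$ ones and $k$ zeros:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2
X = rng.standard_normal((n, k))  # full column rank with probability 1

# A = I_n - X (X^T X)^{-1} X^T
A = np.eye(n) - X @ np.linalg.inv(X.T @ X) @ X.T

# Symmetric and idempotent, as claimed
assert np.allclose(A, A.T)
assert np.allclose(A @ A, A)

# Every eigenvalue is (numerically) 0 or 1, with n - k ones
eigvals = np.linalg.eigvalsh(A)
ones = int(np.sum(np.isclose(eigvals, 1.0)))
zeros = int(np.sum(np.isclose(eigvals, 0.0)))
print(ones, zeros)  # 4 2
```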
The more general result is that if there is a polynomial $p(x)$ such that the matrix $A$ satisfies $p(A) = 0$, then it must hold that all eigenvalues of $A$ satisfy $p(\lambda) = 0$. In the case of an idempotent $A$, this holds with the polynomial $p(x) = x^2 - x$.
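For instance (a toy example of my own, not tied to the regression setting), an involution satisfies $p(A) = 0$ with $p(x) = x^2 - 1$, so its eigenvalues can only be $\pm 1$:

```python
import numpy as np

# A coordinate swap is an involution: A^2 = I, i.e. p(A) = 0 for
# p(x) = x^2 - 1, so every eigenvalue must satisfy lambda^2 = 1.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert np.allclose(A @ A, np.eye(2))

eigvals = np.sort(np.linalg.eigvals(A).real)
print(eigvals)  # [-1.  1.]
```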
Regarding your manipulation of the trace: by the cyclic property of the trace, it is correct to say that $$ \operatorname{tr}\bigl(\underline{u}^T(I_n-X(X^TX)^{-1}X^T)\underline{u}\bigr)=\operatorname{tr}\bigl[(\underline{u}\,\underline{u}^T)(I_n-X(X^TX)^{-1}X^T)\bigr]. $$ From there, you seem to assume that $\operatorname{tr}(\underline{u}\underline{u}^T A) = \operatorname{tr}(\underline{u}\underline{u}^T) \operatorname{tr}(A)$ (where $A$ has its earlier definition), but the trace of a product does not factor into the product of traces in general.
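To make the failure concrete, here is a small numpy sketch (my own illustration, with arbitrary $n=6$, $k=2$ and a random $\underline{u}$) checking that the cyclic identity holds while the factorised version does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2
X = rng.standard_normal((n, k))
A = np.eye(n) - X @ np.linalg.inv(X.T @ X) @ X.T
u = rng.standard_normal(n)

quad = u @ A @ u                                 # u^T A u, a scalar
cyclic = np.trace(np.outer(u, u) @ A)            # tr(u u^T A): equal, by cyclicity
split = np.trace(np.outer(u, u)) * np.trace(A)   # the faulty factorisation

assert np.isclose(quad, cyclic)   # the cyclic property does hold
assert not np.isclose(quad, split)  # tr(BA) != tr(B) tr(A) in general
```

Here `split` equals $(n-k)\sum_i u_i^2$, your claimed value, and it visibly disagrees with the quadratic form.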