Orthogonal fitted values

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

I have two regression models
$$Y=X\beta+\varepsilon,\quad \beta\in\mathbb{R}^k,$$
$$Y=Z\alpha+u,\quad \alpha\in\mathbb{R}^m.$$
It is known that the OLS fitted values $\hat{Y}_x=X\hat\beta$ and $\hat{Y}_z=Z\hat\alpha$ are orthogonal. I have to find the OLS estimates and fitted values of
$$Y=X\beta'+Z\alpha'+v.$$
I understand that the answer is $\hat\beta'=\hat\beta$, $\hat\alpha'=\hat\alpha$, $\hat{Y}_{x+z}=\hat{Y}_x+\hat{Y}_z$. It is easy to check this in the case $k=m=1$: the vectors $X$ and $Z$ are then orthogonal themselves, and the result follows. But I face difficulties proving it in general.

Orthogonality of the fitted values gives
$$\hat{Y}_x^T\hat{Y}_z=(X\hat\beta)^T Z\hat\alpha=Y^T\Pi_X\Pi_Z Y=0,$$
where $\Pi_X=X(X^TX)^{-1}X^T$ is the projector onto the column space of $X$, and $\Pi_Z$ is defined analogously. Taking $A=(X\ Z)$ and trying to show that
$$(A^TA)^{-1}A^TY=\begin{pmatrix}\hat{\beta}\\ \hat{\alpha}\end{pmatrix}=\hat{w},$$
I would need to prove $Z^TX\hat{\beta}=0$ and $X^TZ\hat{\alpha}=0$, but how? I was given a hint about linear independence among the columns of $X$ and $Z$, but I can see that only as a requirement for $\hat{w}$ to exist.
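As a quick numerical sanity check of the claimed result, here is a minimal NumPy sketch (the QR split below is just one convenient way to construct $X$ and $Z$ with mutually orthogonal columns, and the `ols` helper is my own shorthand):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 50, 3, 2

# One way to satisfy the hypothesis: split an orthonormal basis,
# so every column of X is orthogonal to every column of Z.
Q, _ = np.linalg.qr(rng.standard_normal((n, k + m)))
X, Z = Q[:, :k], Q[:, k:]
Y = rng.standard_normal(n)

# OLS via the normal equations: (A^T A)^{-1} A^T y
ols = lambda A, y: np.linalg.solve(A.T @ A, A.T @ y)

beta  = ols(X, Y)                  # fit Y on X alone
alpha = ols(Z, Y)                  # fit Y on Z alone
w     = ols(np.hstack([X, Z]), Y)  # joint fit on A = (X Z)

assert np.allclose(w, np.concatenate([beta, alpha]))             # beta' = beta, alpha' = alpha
assert np.allclose(np.hstack([X, Z]) @ w, X @ beta + Z @ alpha)  # fitted values add
```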
I'm going to hazard a guess at what you mean here: you're saying that the matrices $X$ and $Z$ are such that, regardless of the value of the vector $Y$, the two vectors of fitted values are orthogonal to each other.
If that's what you mean, then it would follow that every column of $X$ is orthogonal to every column of $Z$. Remember that the vector $\hat Y_x$ of fitted values is the orthogonal projection of $Y$ onto the column space of $X$, and likewise $\hat Y_z=\Pi_Z Y$. In particular, take $Y$ equal to a column $x$ of $X$: then $\hat Y_x=x$, and orthogonality gives $0=x^T\Pi_Z x=\|\Pi_Z x\|^2$ (since $\Pi_Z$ is symmetric and idempotent), hence $\Pi_Z x=0$. So every vector in the column space of $X$ is orthogonal to every vector in the column space of $Z$. It follows that $Z^T X=0$ and $X^T Z=0$, because every entry in the matrix product $Z^T X$ is the dot product of a row of $Z^T$ with a column of $X$, and thus is the dot product of a column of $Z$ with a column of $X$.
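From here the step the asker was missing follows at once. Assuming the columns of $A=(X\ Z)$ are linearly independent (so that $A^TA$ is invertible), $X^TZ=0$ makes the normal equations of the combined model decouple into two blocks:
$$A^TA=\begin{pmatrix}X^TX & X^TZ\\ Z^TX & Z^TZ\end{pmatrix}=\begin{pmatrix}X^TX & 0\\ 0 & Z^TZ\end{pmatrix},\qquad A^TY=\begin{pmatrix}X^TY\\ Z^TY\end{pmatrix},$$
so
$$\hat w=(A^TA)^{-1}A^TY=\begin{pmatrix}(X^TX)^{-1}X^TY\\ (Z^TZ)^{-1}Z^TY\end{pmatrix}=\begin{pmatrix}\hat\beta\\ \hat\alpha\end{pmatrix},\qquad \hat Y_{x+z}=A\hat w=X\hat\beta+Z\hat\alpha=\hat Y_x+\hat Y_z.$$
In particular, $Z^TX\hat\beta=0$ and $X^TZ\hat\alpha=0$ hold trivially once $Z^TX=0$.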