Composition of two orthogonal projections

Let $V$ be a finite-dimensional Euclidean space and let $W_1, W_2$ be two subspaces of $V$. Let $P_1, P_2$ denote the orthogonal projections onto $W_1, W_2$, respectively. Is it true that the composition $P_1 \circ P_2$ is always diagonalizable?
Orthogonal projections are Hermitian and positive semi-definite (their eigenvalues are $0$ and/or $1$). The product of two positive semi-definite matrices (operators) is diagonalisable (see, e.g., Corollary 7.6.2 here; unfortunately, that page is not part of the preview). Therefore, the product of two orthogonal projections is diagonalisable.
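As a quick numerical sanity check, here is a minimal Python/NumPy sketch (the helper `orth_proj` is ad hoc, not a library routine). It tests diagonalisability via $\operatorname{rank}(M)=\operatorname{rank}(M^2)$, which rules out nilpotent Jordan blocks for the eigenvalue $0$; the nonzero eigenvalues are generically simple for random subspaces, so this suffices here:

```python
import numpy as np

rng = np.random.default_rng(0)

def orth_proj(W):
    """Orthogonal projection onto the column space of W."""
    Q, _ = np.linalg.qr(W)
    return Q @ Q.T

n = 6
P1 = orth_proj(rng.standard_normal((n, 3)))  # projection onto a random 3-dim subspace
P2 = orth_proj(rng.standard_normal((n, 2)))  # projection onto a random 2-dim subspace
M = P1 @ P2

# Eigenvalue 0 contributes no Jordan blocks of size >= 2 iff rank(M) == rank(M^2).
assert np.linalg.matrix_rank(M) == np.linalg.matrix_rank(M @ M)
print("eigenvalues of P1 P2:", np.round(np.sort(np.linalg.eigvals(M).real)[::-1], 4))
```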
The proof is based on the fact that, given two positive semi-definite matrices $A$ and $B$ with $r=\mathrm{rank}(A)$, there is a nonsingular matrix $X$ such that $$\tag{$*$} A=X\begin{bmatrix}I_r & 0 \\ 0 & 0\end{bmatrix}X^*, \quad B=X^{-*}DX^{-1}, $$ where $D$ is diagonal (and has nonnegative diagonal entries). Since the inner factors $X^*$ and $X^{-*}$ cancel, it follows that $$ AB=X\left(\underbrace{\begin{bmatrix}I_r & 0 \\ 0 & 0\end{bmatrix}D}_{\text{diagonal matrix}}\right)X^{-1}. $$
The proof of ($*$) is not hard but rather technical.
Case $A$ is definite: It is easy to show that $AB$ is diagonalisable if $A$ and $B$ are Hermitian positive semi-definite and at least one of them (say, $A$) is positive definite. To see this, consider the Cholesky factorisation of $A$: $A=LL^*$ for some nonsingular lower triangular matrix $L$. Then, since $L^*BL$ is positive semi-definite, we can unitarily diagonalise it as $$ L^*BL=UDU^*, $$ where $U$ is unitary and $D$ has nonnegative diagonal. Now set $X=LU$ and observe that $$\tag{1} A=LL^*=LUU^*L^*=XX^*, \quad B=L^{-*}UDU^*L^{-1}=X^{-*}DX^{-1}. $$
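The definite case translates into a few lines of NumPy. The following is a sketch over the reals (so $^*$ becomes the transpose); the rank-$3$ choice for $B$ is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)         # Hermitian positive definite
H = rng.standard_normal((n, 3))
B = H @ H.T                         # positive semi-definite, rank 3

L = np.linalg.cholesky(A)           # A = L L^*
d, U = np.linalg.eigh(L.T @ B @ L)  # L^* B L = U D U^* with d >= 0
X = L @ U
Xinv = np.linalg.inv(X)

assert np.allclose(X @ X.T, A)                     # A = X X^*
assert np.allclose(Xinv.T @ np.diag(d) @ Xinv, B)  # B = X^{-*} D X^{-1}
assert np.allclose(X @ np.diag(d) @ Xinv, A @ B)   # AB = X D X^{-1}
```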
Both $A$ and $B$ semi-definite: The case where both $A$ and $B$ are semi-definite is a bit more complicated but proceeds essentially in the same fashion. Let $r=\mathrm{rank}(A)$. There exists a nonsingular matrix $L$ (which need not be triangular) such that $$\tag{2} A=L\begin{bmatrix}I_r&0\\0&0\end{bmatrix}L^*. $$ This decomposition can be obtained, e.g., from the eigen-decomposition of $A$ by "absorbing" the square roots of the nonzero eigenvalues of $A$ into the unitary factor. Note that we cannot proceed directly as in the previous case because for a unitary $U$ we do not have, in general, $$ U\begin{bmatrix}I_r&0\\0&0\end{bmatrix}U^*= \begin{bmatrix}I_r&0\\0&0\end{bmatrix}, $$ which was used to show that $A=XX^*$ before; see (1). So we must find a slightly fancier diagonalisation of $L^*BL$, one which leaves the diagonal matrix in (2) untouched by multiplication from both sides.
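One way to compute such an $L$ (a sketch over the reals; replacing the square roots of the zero eigenvalues by $1$ merely keeps $L$ nonsingular):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 5, 3

H = rng.standard_normal((n, r))
A = H @ H.T                         # positive semi-definite with rank r

lam, Q = np.linalg.eigh(A)          # A = Q diag(lam) Q^*, lam ascending
lam, Q = lam[::-1], Q[:, ::-1]      # put the nonzero eigenvalues first
s = np.sqrt(np.where(lam > 1e-12 * lam.max(), lam, 1.0))
L = Q @ np.diag(s)                  # absorb sqrt of the nonzero eigenvalues

J = np.diag([1.0] * r + [0.0] * (n - r))
assert np.allclose(L @ J @ L.T, A)  # A = L diag(I_r, 0) L^*
```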
So, consider the matrix $$ L^*BL=\left[\begin{array}{cc}B_{11}&B_{12}\\B_{12}^*&B_{22}\end{array}\right]\begin{array}{l}\}r\\\}n-r\end{array}. $$ Since $L^*BL$ is positive semi-definite, we have $\mathcal{N}(B_{11})\subset\mathcal{N}(B_{12}^*)$. (This is easy to show using the fact that if $x^*Cx=0$ for a semi-definite $C$ and a vector $x$, then $Cx=0$; it is in fact a generalisation of the property of semi-definite matrices that a zero on the diagonal implies zeros in the corresponding row and column.) Taking orthogonal complements and using the relation between the fundamental subspaces, this implies $\mathcal{R}(B_{12})\subset \mathcal{R}(B_{11})$, and hence there is a matrix $Z\in\mathbb{C}^{r\times(n-r)}$ such that $B_{12}=B_{11}Z$. It follows that $$ L^*BL=\begin{bmatrix}B_{11}&B_{11}Z\\Z^*B_{11}&B_{22}\end{bmatrix} =\begin{bmatrix}I&0\\Z^*&I\end{bmatrix}\begin{bmatrix}B_{11}&0\\0&B_{22}-Z^*B_{11}Z\end{bmatrix}\begin{bmatrix}I&Z\\0&I\end{bmatrix}. $$ Now both diagonal blocks on the right-hand side are semi-definite (they form a matrix congruent to the positive semi-definite $L^*BL$), so there are unitary matrices $U_1$ and $U_2$ and diagonal positive semi-definite matrices $D_1$ and $D_2$ such that $$ B_{11}=U_1D_1U_1^*, \quad B_{22}-Z^*B_{11}Z=U_2D_2U_2^*. $$ The transformation matrix $X$ we are looking for is then given by $$ X=L\begin{bmatrix}I&-Z\\0&I\end{bmatrix}\begin{bmatrix}U_1&0\\0&U_2\end{bmatrix}. $$ With $D=\mathrm{diag}(D_1,D_2)$, it is easy to see that $B=X^{-*}DX^{-1}$. Indeed, $$ \begin{split} B &= L^{-*} \begin{bmatrix} I & 0 \\ Z^* & I \end{bmatrix} \begin{bmatrix} B_{11} & 0 \\ 0 & B_{22}-Z^*B_{11}Z \end{bmatrix} \begin{bmatrix} I & Z \\ 0 & I \end{bmatrix} L^{-1}\\ &= \underbrace{L^{-*} \begin{bmatrix} I & 0 \\ Z^* & I \end{bmatrix} \begin{bmatrix} U_1 & 0 \\ 0 & U_2 \end{bmatrix}}_{X^{-*}} \begin{bmatrix} D_1 & 0 \\ 0 & D_2 \end{bmatrix} \underbrace{\begin{bmatrix} U_1 & 0 \\ 0 & U_2 \end{bmatrix}^* \begin{bmatrix} I & Z \\ 0 & I \end{bmatrix} L^{-1}}_{X^{-1}}\\ &=X^{-*}DX^{-1}. \end{split} $$ Also, $$ \begin{split} X\begin{bmatrix}I_r&0\\0&0\end{bmatrix}X^* &= L\begin{bmatrix}I&-Z\\0&I\end{bmatrix}\begin{bmatrix}U_1&0\\0&U_2\end{bmatrix} \begin{bmatrix}I_r&0\\0&0\end{bmatrix} \begin{bmatrix}U_1&0\\0&U_2\end{bmatrix}^*\begin{bmatrix}I&0\\-Z^*&I\end{bmatrix}L^*\\ &= L \begin{bmatrix}I_r&0\\0&0\end{bmatrix} L^*\\ &=A. \end{split} $$
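The whole construction can be checked numerically. The sketch below works over the reals and uses `lstsq` to solve $B_{12}=B_{11}Z$; the system is consistent by the range inclusion above, so a least-squares solver returns an exact solution:

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 6, 3

HA = rng.standard_normal((n, r))
A = HA @ HA.T                                  # semi-definite, rank(A) = r
HB = rng.standard_normal((n, 4))
B = HB @ HB.T                                  # semi-definite

# Step 1: A = L J L^* with J = diag(I_r, 0), as in (2).
lam, Q = np.linalg.eigh(A)
lam, Q = lam[::-1], Q[:, ::-1]
L = Q @ np.diag(np.sqrt(np.where(lam > 1e-12 * lam.max(), lam, 1.0)))
J = np.diag([1.0] * r + [0.0] * (n - r))

# Step 2: partition L^* B L and solve B12 = B11 Z.
C = L.T @ B @ L
B11, B12, B22 = C[:r, :r], C[:r, r:], C[r:, r:]
Z = np.linalg.lstsq(B11, B12, rcond=None)[0]

# Step 3: diagonalise the two semi-definite diagonal blocks.
d1, U1 = np.linalg.eigh(B11)
d2, U2 = np.linalg.eigh(B22 - Z.T @ B11 @ Z)

# Step 4: assemble X = L [[I, -Z], [0, I]] blkdiag(U1, U2) and D.
T = np.eye(n)
T[:r, r:] = -Z
U = np.zeros((n, n))
U[:r, :r], U[r:, r:] = U1, U2
X = L @ T @ U
D = np.diag(np.concatenate([d1, d2]))
Xinv = np.linalg.inv(X)

assert np.allclose(X @ J @ X.T, A)             # A = X J X^*
assert np.allclose(Xinv.T @ D @ Xinv, B)       # B = X^{-*} D X^{-1}
assert np.allclose(X @ (J @ D) @ Xinv, A @ B)  # AB = X (J D) X^{-1}, J D diagonal
```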
Note that the product of two oblique projections need not be diagonalisable, since for $$ P_1 = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \quad\text{and}\quad P_2 = \begin{bmatrix} 0 &1 \\ 0&1 \end{bmatrix}, $$ the product $$ P_1P_2=\begin{bmatrix} 0 & 2 \\ 0 & 0 \end{bmatrix} $$ is nonzero and nilpotent, and hence cannot be diagonalised.
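A quick check of the counterexample (again a sketch):

```python
import numpy as np

P1 = np.array([[1.0, 1.0], [0.0, 0.0]])
P2 = np.array([[0.0, 1.0], [0.0, 1.0]])
assert np.allclose(P1 @ P1, P1) and np.allclose(P2 @ P2, P2)  # both are projections

M = P1 @ P2                     # = [[0, 2], [0, 0]]
# M is nonzero but nilpotent (M^2 = 0): every eigenvalue is 0, so a
# diagonalisable M would have to be the zero matrix.
assert not np.allclose(M, 0) and np.allclose(M @ M, 0)
```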