How to prove $\mathbf{1}^\top\mathbf{Q}^+\mathbf{Q}=\mathbf{1}^\top$, where $\mathbf{Q}$ is any element-wise squared correlation matrix?


Let $(X_1,…,X_n)$ be a random vector with $0<\prod_{j=1}^n\text{Var}(X_j)<∞$.

Let $\mathbf{Q}=(\mathbf{q}_{1},…,\mathbf{q}_{n})=(ρ_{jk}^2)_{n×n}$, where $ρ_{jk}$ is the Pearson correlation coefficient between $X_j$ and $X_k$. How to prove or disprove

$$\mathbf{1}^\top\mathbf{Q}^+\mathbf{Q}=\mathbf{1}^\top$$

where $\mathbf{Q}^+$ is the Moore-Penrose inverse of $\mathbf{Q}$ and $\mathbf{1}^\top$ is a row vector of ones?
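Before attempting a proof, the identity can be probed numerically. Below is a sketch in NumPy; the sample size and the rank-deficient construction (five variables driven by two sources, which forces $\mathbf{Q}$ to be singular so the check is non-trivial) are illustrative choices, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Five observed variables driven by only two independent sources,
# so the correlation matrix R -- and hence Q = R∘R -- is singular.
sources = rng.standard_normal((1000, 2))
X = sources @ rng.standard_normal((2, 5))

R = np.corrcoef(X, rowvar=False)   # Pearson correlation matrix
Q = R**2                           # element-wise square
Qp = np.linalg.pinv(Q)             # Moore-Penrose inverse

ones = np.ones(5)
print(np.linalg.matrix_rank(Q) < 5)      # True: Q is singular
print(np.allclose(ones @ Qp @ Q, ones))  # True: the identity holds
```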


It seems that whenever row or column vectors of $\mathbf{Q}$ are linearly dependent, they must be equal. If this is true, let $\mathbf{H}_{n×n}=(\mathbf{h}_1,…,\mathbf{h}_r,\mathbf{0},…,\mathbf{0})^\top$ be the reduced row echelon form of $\mathbf{Q}$, where $r=\text{rank}(\mathbf{Q})$; then $\mathbf{h}_{j}^\top\mathbf{h}_{k}=0$ for all $j\neq k$.

Then, let the column indices of the leading ones in each nonzero row of $\mathbf{H}$ be $j_1,…,j_r$, and let

$$\mathbf{F}_{n×r}=(\mathbf{q}_{j_1},...,\mathbf{q}_{j_r}),\; \mathbf{G}_{r×n}=(\mathbf{h}_1,...,\mathbf{h}_r)^\top$$

By the rank factorization obtained from the reduced row echelon form, we have $\mathbf{Q}=\mathbf{FG}$.

By the construction of the Moore–Penrose inverse from a rank decomposition, we have

$$\mathbf{Q}^+=\mathbf{G}^\top(\mathbf{GG^\top})^{-1}(\mathbf{F^\top F})^{-1}\mathbf{F}^\top$$

Thus, \begin{equation} \begin{split} & \mathbf{1}^\top\mathbf{Q}^+\mathbf{Q} = \mathbf{1}^\top\mathbf{G}^\top(\mathbf{GG^\top})^{-1}\mathbf{G} \\ & = \mathbf{1}^\top \begin{bmatrix} \mathbf{h}_1 & \cdots & \mathbf{h}_r \end{bmatrix} \begin{bmatrix} \mathbf{h}_1^\top\mathbf{h}_1 & \cdots & \mathbf{h}_1^\top\mathbf{h}_r \\ \vdots & \ddots & \vdots \\ \mathbf{h}_r^\top\mathbf{h}_1 & \cdots & \mathbf{h}_r^\top\mathbf{h}_r \\ \end{bmatrix}^{-1} \begin{bmatrix} \mathbf{h}_1^\top \\ \vdots \\ \mathbf{h}_r^\top \\ \end{bmatrix} \\ & = \mathbf{1}^\top\sum_{i=1}^r (\mathbf{h}_i^\top\mathbf{h}_i)^{-1} \mathbf{h}_i \mathbf{h}_i^\top = \mathbf{1}^\top \end{split} \end{equation}

where $\mathbf{h}_i^\top\mathbf{h}_i$ is the number of ones in $\mathbf{h}_i$ and $\mathbf{h}_i\mathbf{h}_i^\top$ is an $n×n$ matrix that, up to a simultaneous permutation of rows and columns, is block diagonal with all-ones and zero blocks.
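The rank-factorization formula for $\mathbf{Q}^+$ used above holds for any full-rank factorization, and can be sanity-checked numerically. A sketch with generic random factors (not specific to correlation matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 2
F = rng.standard_normal((n, r))   # full column rank (generic)
G = rng.standard_normal((r, n))   # full row rank (generic)
Q = F @ G                         # a rank factorization Q = FG

# Q^+ = G^T (G G^T)^{-1} (F^T F)^{-1} F^T
Qp = G.T @ np.linalg.inv(G @ G.T) @ np.linalg.inv(F.T @ F) @ F.T
print(np.allclose(Qp, np.linalg.pinv(Q)))  # True
```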

Therefore, the question may reduce to proving or disproving that linearly dependent rows of $\mathbf{Q}$ must be equal.


3 Answers

Best Answer

The Schur product theorem tells us that $\mathbf{Q}$ is positive semi-definite. Then, by a standard property of the Moore–Penrose pseudoinverse, $\mathbf{Q}^{+}\mathbf{Q} = \mathbf{Q}\mathbf{Q}^{+}$ is the orthogonal projection onto the range of $\mathbf{Q}$, yielding the following equivalence:

\begin{align*} \mathbf{1}^{\top}\mathbf{Q}^{+}\mathbf{Q} = \mathbf{1}^{\top} &\quad\iff\quad \mathbf{1} \in \operatorname{im}\mathbf{Q} = (\ker \mathbf{Q})^{\perp} \\ &\quad\iff\quad (\ker\mathbf{Q}) \perp \mathbf{1} \end{align*}
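The projector characterization invoked here is easy to confirm numerically. A sketch with a generic singular PSD matrix, not tied to the correlation setting:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2))
Q = A @ A.T                      # PSD with rank 2
P = np.linalg.pinv(Q) @ Q        # orthogonal projector onto im(Q)

print(np.allclose(P, P @ P))     # True: idempotent
print(np.allclose(P, P.T))       # True: symmetric
v = Q @ rng.standard_normal(5)   # any v in im(Q)
print(np.allclose(v @ P, v))     # True: fixed by the projector
```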

We will establish the last relation using a probabilistic argument.


Proof. Replacing each $X_i$ by its standardization if necessary, we may assume $\mathsf{Var}(X_i) = 1$ for each $i$. Let $\mathbf{\Sigma}_X$ be the covariance matrix of $X$. Also, define the random vectors $\tilde{X}$ and $Y$ by

$$ \tilde{X} \sim \mathcal{N}(\mathbf{0}, \mathbf{\Sigma}_X) \qquad\text{and}\qquad Y = \tilde{X}^{\circ 2} = (\tilde{X}_1^2, \ldots, \tilde{X}_n^2)^{\top}, $$

where $\circ$ denotes Hadamard/entrywise product. Then, as in the proof of the Schur product theorem, the covariance matrix $\mathbf{\Sigma}_Y$ of $Y$ is given by $ \mathbf{\Sigma}_Y = 2\mathbf{\Sigma}_X^{\circ 2} = 2\mathbf{Q} $; indeed, by Isserlis' theorem, $\operatorname{Cov}(\tilde{X}_i^2, \tilde{X}_j^2) = 2\sigma_{ij}^2$ for jointly Gaussian $(\tilde{X}_i, \tilde{X}_j)$. So, it suffices to show

$$ (\ker \mathbf{\Sigma}_Y) \perp \mathbf{1}. $$
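The identity $\mathbf{\Sigma}_Y = 2\mathbf{\Sigma}_X^{\circ 2}$ can be checked by Monte Carlo simulation. A rough sketch; the dimension, sample size, and tolerance are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
S = B @ B.T
d = np.sqrt(np.diag(S))
S = S / np.outer(d, d)                  # a correlation matrix (unit diagonal)

# Sample from N(0, S) and square entrywise, as in the proof.
Xt = rng.multivariate_normal(np.zeros(4), S, size=1_000_000)
Y = Xt**2
# Sample covariance of Y should be close to 2 * S∘S.
print(np.allclose(np.cov(Y, rowvar=False), 2 * S**2, atol=0.05))  # True
```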

To this end, assume $\mathbf{v} \in \ker \mathbf{\Sigma}_Y$. Then

\begin{align*} \mathbf{v} \in \ker \mathbf{\Sigma}_Y &\quad\implies\quad 0 = \mathbf{v}^{\top}\mathbf{\Sigma}_Y\mathbf{v} = \mathsf{Var}(\mathbf{v}^{\top}Y) \\ &\quad\iff\quad \mathbf{v}^{\top}Y = \mathsf{E}[\mathbf{v}^{\top}Y] = \mathbf{v}^{\top}\mathbf{1} \quad \text{a.s.} \\ &\quad\iff\quad \mathbf{v}^{\top}(\mathbf{x}^{\circ 2}) = \mathbf{v}^{\top}\mathbf{1} \quad \text{for any } \mathbf{x} \in \operatorname{im}\mathbf{\Sigma}_X \end{align*}

In particular, plugging $\mathbf{x} = \mathbf{0}$ shows that $\mathbf{v}^{\top}\mathbf{1} = 0$ and hence $\mathbf{v} \perp \mathbf{1}$. $\square$


Addendum. This argument actually proves a more general statement:

Theorem. Let $\mathbf{A}$ be an $n\times n$ positive semi-definite matrix, and let $\mathbf{Q} = \mathbf{A}^{\circ 2}$ be the Hadamard/entrywise square of $\mathbf{A}$. Then

$$ \mathbf{v}^{\top}\mathbf{Q}^+ \mathbf{Q} = \mathbf{v}^{\top}, $$

where $\mathbf{v} = (a_{11}, a_{22}, \ldots, a_{nn})^{\top}$ is the main diagonal of $\mathbf{A}$.
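The addendum's more general statement can likewise be probed numerically. A sketch; the low-rank construction just forces $\mathbf{Q}$ to be singular so the check is non-trivial:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((6, 2))
A = B @ B.T                    # PSD with rank 2, so Q = A∘A has rank <= 3 < 6
Q = A**2                       # Hadamard square of A
v = np.diag(A)                 # main diagonal of A

print(np.linalg.matrix_rank(Q) < 6)                # True: Q is singular
print(np.allclose(v @ np.linalg.pinv(Q) @ Q, v))   # True: v^T Q^+ Q = v^T
```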

Answer by user1551

We have $\mathbf1^\top Q^+Q=\mathbf1^\top$ if and only if the vector of ones lies inside the row space of $Q$, i.e., iff the equation $u^\top Q=\mathbf1^\top$ is solvable in $u$.
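This solvability criterion can be tested directly with least squares (a sketch; since $Q$ is symmetric, solving $Qu=\mathbf1$ is the same as solving $u^\top Q=\mathbf1^\top$, and a zero residual means $\mathbf1$ lies in the row space):

```python
import numpy as np

rng = np.random.default_rng(5)
sources = rng.standard_normal((500, 2))
X = sources @ rng.standard_normal((2, 6))
Q = np.corrcoef(X, rowvar=False)**2    # singular: rank(Q) <= 3 < 6

# Zero residual in the least-squares solution means 1 lies in im(Q),
# i.e., u^T Q = 1^T is solvable (Q is symmetric).
u = np.linalg.lstsq(Q, np.ones(6), rcond=None)[0]
print(np.allclose(Q @ u, np.ones(6)))  # True
```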

If $Q$ is invertible, there is nothing to prove.

Suppose $Q$ is singular. Let $R$ be the underlying correlation matrix, so that $Q=R\circ R$, and let $r=\operatorname{rank}(R)$ and $q=\operatorname{rank}(Q)$. Then $r>0$ and $0<q<n$. By the Schur product theorem, $Q=R\circ R$ is positive semidefinite. By reindexing the rows and columns of $R$ and $Q$ if necessary, we may assume that the leading principal $q\times q$ submatrix of $Q$ has rank $q$ (and is thus positive definite). Let us express $R$ as a Gram matrix: $$ R=\pmatrix{X_{q\times r}\\ Y_{(n-q)\times r}}\pmatrix{X^\top& Y^\top} =\pmatrix{XX^\top&XY^\top\\ YX^\top&YY^\top}. $$ Then $$ Q=\pmatrix{(XX^\top)^{\circ2}&(XY^\top)^{\circ2}\\ (YX^\top)^{\circ2} &(YY^\top)^{\circ2}}, $$ where $(\cdot)^{\circ2}$ means Hadamard/entrywise matrix square, e.g. $(XX^\top)^{\circ2}=(XX^\top)\circ(XX^\top)$.

Since $Q$ has rank $q$, every $(q+1)\times(q+1)$ principal submatrix of $Q$ containing the leading $q\times q$ block is singular. In particular, $$ \pmatrix{(XX^\top)^{\circ2}&(Xy)^{\circ2}\\ (y^\top X^\top)^{\circ2}&1}\tag{$\dagger$} $$ is singular for every column $y$ of $Y^\top$ (or for every row $y^\top$ of $Y$). So, we must have $$ (y^\top X^\top)^{\circ2} \ \big((XX^\top)^{\circ2}\big)^{-1} \ (Xy)^{\circ2}=1.\tag{1} $$ Moreover, as $Q$ has the same rank as $(XX^\top)^{\circ2}$, the second block column of $Q$ must be equal to the first block column of $Q$ right-multiplied by some matrix $M$. Thus $$ Q=\pmatrix{(XX^\top)^{\circ2}& (XX^\top)^{\circ2}M\\ (YX^\top)^{\circ2} &(YX^\top)^{\circ2}M} $$ where $M=\big((XX^\top)^{\circ2}\big)^{-1}(XY^\top)^{\circ2}$. So, in the equation $u^\top Q=\mathbf1^\top$, if we partition $u^\top$ and $\mathbf1^\top$ as $(v^\top,w^\top)$ and $(e^\top,f^\top)$ respectively, we have $$ \begin{align} &u^\top Q=\mathbf1^\top\\ \Leftrightarrow\quad& u^\top Q\pmatrix{I_q&-M\\ 0&I_{n-q}}=\mathbf1^\top\pmatrix{I_q&-M\\ 0&I_{n-q}}\\ \Leftrightarrow\quad& (v^\top,w^\top) \pmatrix{(XX^\top)^{\circ2}&0\\ (YX^\top)^{\circ2} &0} =(e^\top,f^\top)\pmatrix{I_q&-M\\ 0&I_{n-q}}\\ \Leftrightarrow\quad& \begin{cases} v^\top(XX^\top)^{\circ2}+w^\top(YX^\top)^{\circ2}=e^\top,\\ e^\top M=f^\top. \end{cases}\tag{2} \end{align} $$ Since $(XX^\top)^{\circ2}$ is positive definite (hence invertible), the first equation in $(2)$ is always solvable. It follows that $u^\top Q=\mathbf1^\top$ is solvable if and only if $e^\top M=f^\top$, i.e., if and only if $e^\top\big((XX^\top)^{\circ2}\big)^{-1}\big((XY^\top)^{\circ2}\big)=f^\top$. Thus the problem now boils down to proving that for each column $y$ of $Y^\top$ (i.e., for each row $y^\top$ of $Y$), we have $$ e^\top \big((XX^\top)^{\circ2}\big)^{-1} (Xy)^{\circ2}=1.\tag{3} $$ Since $R$ has a diagonal of ones, $y$ is a unit vector. Pick any $r\times r$ orthogonal matrix $U$ whose first column is $y$.
Define $Z=XU$ and denote the $j$-th column of $Z$ by $z_j$. Then $z_1=Xy$. Condition $(1)$ and the statement-to-prove $(3)$ can be reformulated as $$ \begin{align} (z_1^{\circ2})^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}&=1,\tag{1a}\\ e^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}&=1.\tag{3a} \end{align} $$ If $Z$ has two or more columns (i.e., if $r\ge2$), then for any $j>1$, we consider the matrices $$ \begin{aligned} &R'=\pmatrix{Z\\ e_1^\top\\ e_j^\top}\pmatrix{Z^\top&e_1&e_j} =\pmatrix{ZZ^\top&z_1&z_j\\ z_1^\top&1&0\\ z_j^\top&0&1},\\ &Q'=(R')^{\circ2}=\pmatrix{(ZZ^\top)^{\circ2}&(z_1)^{\circ2}&(z_j)^{\circ2}\\ ((z_1)^{\circ2})^\top&1&0\\ ((z_j)^{\circ2})^\top&0&1}.\\ \end{aligned} $$ Since $Q'$ is the entrywise square of the Gram matrix $R'$, it is positive semidefinite. However, as its leading principal $(q+1)\times(q+1)$ submatrix (i.e., the matrix $(\dagger)$) is singular, it cannot be positive definite. Hence $Q'$ is singular. In turn, the Schur complement of $(ZZ^\top)^{\circ2}$ in $Q'$, namely, the $2\times2$ matrix $$ \begin{aligned} &I_2- \pmatrix{((z_1)^{\circ2})^\top\\ ((z_j)^{\circ2})^\top} \big((ZZ^\top)^{\circ2}\big)^{-1} \pmatrix{(z_1)^{\circ2}&(z_j)^{\circ2}}\\ &=\pmatrix{0&-((z_1)^{\circ2})^\top\big((ZZ^\top)^{\circ2}\big)^{-1}(z_j)^{\circ2}\\ -((z_j)^{\circ2})^\top\big((ZZ^\top)^{\circ2}\big)^{-1}(z_1)^{\circ2} &1-((z_j)^{\circ2})^\top\big((ZZ^\top)^{\circ2}\big)^{-1}(z_j)^{\circ2}}, \end{aligned} $$ must be singular. Its determinant is $-\big((z_1^{\circ2})^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_j^{\circ2}\big)^2$ (the $(1,1)$ entry vanishes by $(1\mathrm{a})$), so $(z_j^{\circ2})^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}=0$. So, regardless of whether $r\ge2$, we always have $(z_1^{\circ2})^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}=\left(\sum_{j=1}^r z_j^{\circ2}\right)^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}$. Hence $(1a)$ gives $$ \left(\sum_{j=1}^r z_j^{\circ2}\right)^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}=1. $$ Now, observe that each row of $Z$ is a unit vector. Thus $\sum_{j=1}^r z_j^{\circ2}=e$ and $(3a)$ follows.

Answer

This is not a new answer. After studying @user1551's answer for a week, I reorganized the proof and added some details.

When $\mathbf{Q}$ has full rank, $\mathbf{Q}^+=\mathbf{Q}^{-1}$, and clearly $\mathbf1^\top\mathbf{Q}^{-1}\mathbf{Q}=\mathbf1^\top$.

When $\mathbf{Q}$ is singular, suppose $\text{rank}(\mathbf{Q})=q$. Suppose rows $i_1,\dots,i_q$ of $\mathbf{Q}$ form a maximal linearly independent set in the row space, and let $i_{q+1},\dots,i_n$ be the indices of the remaining rows. Permute the rows and columns of $\mathbf{Q}$ into the order $i_1,\dots,i_q,i_{q+1},\dots,i_n$ and denote the permuted matrix by $Q$, whose leading principal $q\times q$ submatrix has full rank.

Let $R$ be the correlation matrix satisfying $Q=R\circ R$. Suppose the rank of $R$ is $r$. Since $R$ is a correlation matrix, it is a Gram matrix, which is a symmetric matrix of inner products. Thus, $R$ and $Q$ can be expressed as $$ R=\pmatrix{ X_{q\times r} \\ Y_{(n-q)\times r}} \pmatrix{X^\top& Y^\top} =\pmatrix{XX^\top&XY^\top\\ YX^\top&YY^\top},\quad Q=R^{\circ2}=\pmatrix{(XX^\top)^{\circ2}&(XY^\top)^{\circ2}\\ (YX^\top)^{\circ2} &(YY^\top)^{\circ2}} $$ where $(\cdot)^{\circ2}$ means element-wise square and $(XX^\top)^{\circ2}$ is a $q\times q$ full-rank submatrix. Since the columns in the first block column of $Q$ form a maximal linearly independent set, each column in the second block column of $Q$ is a linear combination of them. Since $(XX^\top)^{\circ2}$ is invertible, let $M=((XX^\top)^{\circ2})^{-1}(XY^\top)^{\circ2}$. We have $$ Q=\pmatrix{(XX^\top)^{\circ2} & (XX^\top)^{\circ2}M \\ (YX^\top)^{\circ2} & (YX^\top)^{\circ2}M} $$ Since the diagonal elements of $XX^\top$ are ones, the row vectors of $X$ are unit vectors. For $1\le i\le n-q$, let $y_i$ denote the $i$-th column of $Y^\top$. Since $\pmatrix{ (XX^\top)^{\circ2} & (Xy_i)^{\circ2} \\ (y_i^\top X^\top)^{\circ2} & 1}$ does not have full rank, we have $$ \begin{align} \left|\begin{matrix} (XX^\top)^{\circ2} & (Xy_i)^{\circ2} \\ (y_i^\top X^\top)^{\circ2} & 1 \\ \end{matrix}\right| = \left|(XX^\top)^{\circ2}\right|\left|1 - (y_i^\top X^\top)^{\circ2}\big((XX^\top)^{\circ2}\big)^{-1}(Xy_i)^{\circ2}\right| = 0 \\ \Longrightarrow\quad (y_i^\top X^\top)^{\circ2}\big((XX^\top)^{\circ2}\big)^{-1}(Xy_i)^{\circ2}=1\tag{1}\end{align} $$ Since $y_i^\top y_i=1$, let $U_{r\times r}$ be any orthogonal matrix with the first column being $y_i$. Let $Z_{q\times r}=(z_1,\dots,z_r)=XU$, where $z_1=Xy_i$. Since an orthogonal transformation preserves the length of a vector, the row vectors of $Z$ are unit vectors, i.e., $\sum_{j=1}^r z_j^{\circ2}=\mathbf1_q$. By substituting $X=ZU^\top$ into (1), we have

$$ (z_1^{\circ2})^\top \big((ZZ^\top)^{\circ2}\big)^{-1} z_1^{\circ2}=1\tag{1a} $$ When $r\ge2$, for any $2\le j\le r$, let $$ \begin{aligned} &R_{(q+2)\times(q+2)}'=\pmatrix{Z\\ e_1^\top\\ e_j^\top}\pmatrix{Z^\top&e_1&e_j} =\pmatrix{ZZ^\top&z_1&z_j\\ z_1^\top&1&0\\ z_j^\top&0&1} \\ &Q_{(q+2)\times(q+2)}'=(R')^{\circ2}=\pmatrix{(ZZ^\top)^{\circ2}&z_1^{\circ2}&z_j^{\circ2}\\ (z_1^{\circ2})^\top&1&0\\ (z_j^{\circ2})^\top&0&1} \\ \end{aligned} $$ where $e_j$ denotes the $j$-th standard basis vector. Since $R'$ is a Gram matrix, which is positive semidefinite, according to the Schur product theorem, $Q'$ is positive semidefinite. Since the determinant of the leading principal $(q+1)\times(q+1)$ submatrix of $Q'$ is $$ \left|\begin{matrix} (ZZ^\top)^{\circ2}&z_1^{\circ2} \\ (z_1^{\circ2})^\top&1 \\ \end{matrix}\right| =\left|\begin{matrix} (XX^\top)^{\circ2} & (Xy_i)^{\circ2} \\ (y_i^\top X^\top)^{\circ2} & 1 \\ \end{matrix}\right|=0 $$ $Q'$ does not have full rank. Since $(ZZ^\top)^{\circ2}$ is positive definite, the Schur complement of $(ZZ^\top)^{\circ2}$ in $Q'$ must be singular. Thus, we have (using $(1\text{a})$) $$ \begin{aligned} \left|Q'/(ZZ^\top)^{\circ2}\right| =& \left| I_2- \pmatrix{(z_1^{\circ2})^\top\\ (z_j^{\circ2})^\top} \pmatrix{(ZZ^\top)^{\circ2}}^{-1} \pmatrix{z_1^{\circ2}&z_j^{\circ2}}\right| \\ =& \left|\begin{matrix} 0 & -(z_1^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_j^{\circ2}\\ -(z_j^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_1^{\circ2} & 1-(z_j^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_j^{\circ2}\end{matrix}\right| \\ =& -\left((z_1^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_j^{\circ2}\right)^2=0 \\ \Rightarrow \quad & (z_j^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_1^{\circ2}=0 \\ \Rightarrow \quad & \sum_{j=2}^r(z_j^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_1^{\circ2}=0 \\ \Rightarrow \quad & \sum_{j=1}^r(z_j^{\circ2})^\top[(ZZ^\top)^{\circ2}]^{-1}z_1^{\circ2}=1 \\ \end{aligned} $$

Since $\sum_{j=1}^r z_j^{\circ2}=\mathbf1_q$, we have $\mathbf1_q^\top\left((ZZ^\top)^{\circ2}\right)^{-1}z_1^{\circ2}=1$. By substituting $Z=XU$ and $z_1=Xy_i$, we have $\mathbf1_q^\top\left((XX^\top)^{\circ2}\right)^{-1}(Xy_i)^{\circ2}=1$. Thus, $$ \begin{align} &\mathbf1_q^\top\left((XX^\top)^{\circ2}\right)^{-1}\pmatrix{(Xy_1)^{\circ2}&\cdots&(Xy_{n-q})^{\circ2}} \\ =&\mathbf1_q^\top\left((XX^\top)^{\circ2}\right)^{-1}(XY^\top)^{\circ2}=\mathbf1_q^\top M=\mathbf1_{n-q}^\top \tag{2a}\end{align} $$ Since $\text{rank}(\begin{bmatrix}(XX^\top)^{\circ2} & (XY^\top)^{\circ2}\end{bmatrix})=\text{rank}(\begin{bmatrix}(XX^\top)^{\circ2} & (XY^\top)^{\circ2} & \mathbf1_q\end{bmatrix})=q$, the system of linear equations $$ \begin{bmatrix}(XX^\top)^{\circ2} & (XY^\top)^{\circ2}\end{bmatrix}u_{n\times1}=\mathbf1_q \tag{2b} $$ in the $n$ unknown elements of $u$ has solutions. By rearranging $\text{(2a)}$ and $\text{(2b)}$ in matrix form, we have $$ \begin{align} \mathbf1_n^\top\pmatrix{I_q&-M\\ 0 & I_{n-q}} =& u^\top\pmatrix{(XX^\top)^{\circ2}&0\\ (YX^\top)^{\circ2} &0} \\ =& u^\top\pmatrix{(XX^\top)^{\circ2}&(XY^\top)^{\circ2}\\ (YX^\top)^{\circ2} &(YY^\top)^{\circ2}}\pmatrix{I_q&-\big((XX^\top)^{\circ2}\big)^{-1}(XY^\top)^{\circ2}\\ 0&I_{n-q}} \\ =& u^\top Q\pmatrix{I_q&-M\\ 0&I_{n-q}} \\ \end{align} $$ Since $\pmatrix{I_q&-M\\ 0&I_{n-q}}$ has determinant one and is therefore invertible, the system of linear equations $\mathbf1^\top=u^\top Q$ has solutions. For any such solution $u$, we have $\mathbf1^\top Q^+Q = u^\top QQ^+Q = u^\top Q = \mathbf1^\top$.

Finally, $Q$ is obtained from $\mathbf{Q}$ by row- and column-switching transformations. Denoting the corresponding elementary matrices by $T_1,\dots,T_s$, we have $Q=T_s\cdots T_1\mathbf{Q}T_1\cdots T_s$. Since each $T_i$ is an orthogonal matrix, $(T_i\mathbf{Q}T_i)^+=T_i^\top\mathbf{Q}^+T_i^\top$, and since

$T_i^{-1}=T_i=T_i^\top$, we have $$ \mathbf1^\top Q^+Q = \mathbf1^\top (T_s\cdots T_1\mathbf{Q}T_1\cdots T_s)^+(T_s\cdots T_1\mathbf{Q}T_1\cdots T_s) =\mathbf1^\top T_s\cdots T_1\mathbf{Q}^+\mathbf{Q}T_1\cdots T_s=\mathbf1^\top $$ i.e., $T_s\cdots T_1\mathbf{Q}^+\mathbf{Q}T_1\cdots T_s$ is a matrix whose every column sums to one. Since $\mathbf{Q}^+\mathbf{Q}$ is obtained from it by row- and column-switching transformations, each column of $\mathbf{Q}^+\mathbf{Q}$ still sums to one, i.e., $\mathbf1^\top\mathbf{Q}^+\mathbf{Q}=\mathbf1^\top$.
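The final permutation step relies on the pseudoinverse commuting with orthogonal (here, permutation) similarity, which can be confirmed numerically. A small sketch with a generic singular PSD test matrix:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((5, 2))
Q = (A @ A.T)**2                   # a singular PSD test matrix
P = np.eye(5)[rng.permutation(5)]  # a random permutation matrix

# (P Q P^T)^+ = P Q^+ P^T for any orthogonal P
lhs = np.linalg.pinv(P @ Q @ P.T)
rhs = P @ np.linalg.pinv(Q) @ P.T
print(np.allclose(lhs, rhs))       # True
```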