I have a problem proving a matrix identity that comes from a data validation and reconciliation problem. So far we can "see" numerically that the following identity seems to hold, but have not been able to prove it. Here it is:
Let $S\in \mathbb{R}^{m\times m}$ be a symmetric, positive definite matrix, and $F\in \mathbb{R}^{l\times m}$ and $G\in \mathbb{R}^{l\times n}$ matrices with full rank. Define $$Z=\begin{bmatrix}S^{-1} & 0 & F^T \\ 0 & 0 & G^T \\ F & G & 0 \end{bmatrix}$$ and assume $Z$ to be invertible. Now define $P_m=\begin{bmatrix}I_m&0_{m,n+l}\end{bmatrix}$ and $P_l=\begin{bmatrix}0_{l,m+n}&I_l\end{bmatrix}$.
What we need to compute is the following matrix $$T = (P_m Z^{-1} P_l^T) (FSF^T) (P_m Z^{-1} P_l^T)^T.$$ It seems, however, that $T$ can also be computed more easily, namely as $$\tilde T=S - P_m Z^{-1} P_m^T = (P_m Z P_m^T)^{-1} - P_m Z^{-1} P_m^T.$$ Numerical experiments up to $l=50$, choosing arbitrary $m$ and $n$ such that $n\leq l$ and $l\leq m+n$ (otherwise $Z$ is not invertible), showed that $\|T-\tilde T\|_F<10^{-7}\|S\|_F$ for randomly generated matrices $S,F,G$ fulfilling the stated assumptions.
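To make the setup reproducible, here is a minimal numpy sketch of the numerical experiment described above. The concrete dimensions and the way $S$ is made positive definite are my own arbitrary choices, not part of the original experiments:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, l = 5, 3, 4                     # chosen so that n <= l <= m + n

# Random data fulfilling the assumptions
A = rng.standard_normal((m, m))
S = A @ A.T + m * np.eye(m)           # symmetric positive definite
F = rng.standard_normal((l, m))       # full rank almost surely
G = rng.standard_normal((l, n))       # full rank almost surely

# Assemble the KKT-style block matrix Z
Z = np.block([
    [np.linalg.inv(S), np.zeros((m, n)), F.T],
    [np.zeros((n, m)), np.zeros((n, n)), G.T],
    [F,                G,                np.zeros((l, l))],
])
Zinv = np.linalg.inv(Z)

Pm_Zinv_Pl = Zinv[:m, m + n:]         # P_m Z^{-1} P_l^T (top-right block)
Pm_Zinv_Pm = Zinv[:m, :m]             # P_m Z^{-1} P_m^T (top-left block)

T = Pm_Zinv_Pl @ (F @ S @ F.T) @ Pm_Zinv_Pl.T
T_tilde = S - Pm_Zinv_Pm

rel_err = np.linalg.norm(T - T_tilde) / np.linalg.norm(S)
print(rel_err)                        # tiny, as in the experiments above
```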
However, I cannot prove it analytically. So far, I have tried the Schur complement, the Woodbury matrix identity and the binomial inverse theorem on $Z$, but none of these identities gets me anywhere. One more observation is that $T$ can also be written as $T = USU^T$ with $U=P_m Z^{-1} P_l^T F$, where $U$ is a projection matrix (i.e. $U^2=U$) with rank $l-n$, and the image of $P_mZ^{-1}P_m^T$ seems to be equal to the kernel of $P_m Z^{-1} P_m^T$.
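The projection claim can also be checked numerically. A sketch along the same lines as the experiment above (the dimensions are again arbitrary choices of mine, subject to $n\leq l\leq m+n$):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, l = 6, 2, 5                     # n <= l <= m + n

A = rng.standard_normal((m, m))
S = A @ A.T + m * np.eye(m)           # symmetric positive definite
F = rng.standard_normal((l, m))
G = rng.standard_normal((l, n))

Z = np.block([
    [np.linalg.inv(S), np.zeros((m, n)), F.T],
    [np.zeros((n, m)), np.zeros((n, n)), G.T],
    [F,                G,                np.zeros((l, l))],
])
Zinv = np.linalg.inv(Z)

# U = P_m Z^{-1} P_l^T F should be a projection of rank l - n
U = Zinv[:m, m + n:] @ F
idempotency_err = np.linalg.norm(U @ U - U)
rank_U = np.linalg.matrix_rank(U)
print(idempotency_err, rank_U)
```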
Note: just a short explanation of what all these matrices mean: $S$ is the covariance matrix of the input data; $F$ and $G$ are the constraint matrices of the measured and unmeasured input data; $T$ is the covariance matrix of the reconciliation vector, i.e. the correction that needs to be applied to the input data. The formula for $T$ is just the usual forward error propagation under the assumption that everything is nicely linear and Gaussian. The blocks in $Z$ correspond to the measured data, the unmeasured data and the Lagrange multipliers, respectively.
In general, $Z$ may be singular. For example, consider $m=n=1<l=2$ and $$ Z=\begin{pmatrix} 1&0&1&0\\ 0&0&1&0\\ 1&1&0&0\\ 0&0&0&0 \end{pmatrix}. $$ Suppose now that $Z$ is invertible. Then we can prove the identity using the Schur complement. The Schur complement of $S^{-1}$ in $Z$ is given by $$Y = \begin{pmatrix}0&G^T\\G&-FSF^T\end{pmatrix}.$$ Then $$ Z^{-1} = \begin{pmatrix} S + S (0, F^T) Y^{-1} \begin{pmatrix}0\\F\end{pmatrix}S & -S(0, F^T) Y^{-1}\\ \\ \ast & \ast \end{pmatrix}. $$

Now write $Y^{-1} = \begin{pmatrix}\ast&R^T\\R&Q\end{pmatrix}$, where $Q$ is symmetric because $Y$ is symmetric. Then $P_m Z^{-1} P_l^T = -S F^T Q$ and \begin{align*} H &:= FSF^T,\\ T &= S F^T Q (FSF^T) QF S = SF^T QHQFS,\\ \widetilde{T} &= S - P_m Z^{-1} P_m^T = S - \left(S + S (0, F^T) Y^{-1} \begin{pmatrix}0\\F\end{pmatrix} S \right) = - S F^T Q F S. \end{align*} Therefore, to show that $T=\widetilde{T}$, we need to show $SF^T QHQFS=- SF^T QFS$. Hence it suffices to verify that $QHQ=-Q$.

From the identity $YY^{-1}=I$, we get \begin{align*} G^TQ&=0,\\ GR^T-HQ&=I. \end{align*} Left-multiplying the second identity by $Q$ gives $Q(GR^T-HQ)=Q$. Yet, by the first identity and the symmetry of $Q$, we have $QGR^T = (RG^TQ)^T = 0$, so that $Q(GR^T-HQ) = -QHQ$. Hence $QHQ=-Q$ and we are done.
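As a sanity check, the key step $QHQ=-Q$ (together with $G^TQ=0$) can be verified numerically by forming the Schur complement $Y$ directly; the random setup below mirrors the question's experiments, with arbitrary dimensions of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, l = 5, 3, 4                     # n <= l <= m + n

A = rng.standard_normal((m, m))
S = A @ A.T + m * np.eye(m)           # symmetric positive definite
F = rng.standard_normal((l, m))
G = rng.standard_normal((l, n))
H = F @ S @ F.T

# Y is the Schur complement of S^{-1} in Z
Y = np.block([
    [np.zeros((n, n)), G.T],
    [G,               -H],
])
# Q is the lower-right l x l block of Y^{-1}
Q = np.linalg.inv(Y)[n:, n:]

lemma_err = np.linalg.norm(Q @ H @ Q + Q)   # should vanish: QHQ = -Q
ortho_err = np.linalg.norm(G.T @ Q)         # should vanish: G^T Q = 0
print(lemma_err, ortho_err)
```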