Computation of $\phi\phi^T$ where $\phi$ is a vector that depends on semi-orthogonal matrices.


Let $r \leq \min(m,n)$ and $U \in \mathbb{R}^{m \times r}$, $V \in \mathbb{R}^{n \times r}$ be matrices such that $U^TU=V^TV=I_r$. Let also $U_{\perp} \in \mathbb{R}^{m \times (m-r)}$ and $V_{\perp} \in \mathbb{R}^{n \times (n-r)}$ be two matrices such that $ \begin{bmatrix} U & U_{\perp} \\ \end{bmatrix}$ and $ \begin{bmatrix} V & V_{\perp} \\ \end{bmatrix}$ are orthogonal matrices.

For every $1 \leq i \leq m$ and $1 \leq j \leq n$, let $$\phi(i,j)=\begin{bmatrix} \operatorname{vec}(U_iV_j^T) \\ \operatorname{vec}(U_iV_{\perp,j}^T) \\ \operatorname{vec}(U_{\perp,i}V_j^T) \end{bmatrix} \in \mathbb{R}^{d}, $$where $d:=r(m+n)-r^2$, $A_i$ denotes the transpose of the $i$-th row of a matrix $A$, and $\operatorname{vec}$ is the vectorization operator.

I am wondering whether or not we have $$B:=\sum_{1 \leq i \leq m,\ 1 \leq j \leq n}\phi(i,j)\phi(i,j)^T=I_d.$$ I can show that this is true for $m=n=2$ and $r=1$. Indeed, in this case $U_i$, $U_{\perp,i}$, $V_j$ and $V_{\perp,j}$ are just scalars; the diagonal entries of $B$ are $\sum_{i}U_i^2\sum_{j}V_j^2$, $\sum_{i}U_i^2\sum_{j}V_{\perp,j}^2$ and $\sum_{i}U_{\perp,i}^2\sum_{j}V_j^2$, which are all equal to $1$ by orthogonality. Finally, each off-diagonal entry has either $\sum_{i}U_{i}U_{\perp,i}$ or $\sum_{j} V_jV_{\perp,j}$ as a factor, and both sums vanish because $U$, $U_{\perp}$ (resp. $V$, $V_{\perp}$) are columns of the orthogonal matrix $ \begin{bmatrix} U & U_{\perp} \\ \end{bmatrix}$ (resp. $ \begin{bmatrix} V & V_{\perp} \\ \end{bmatrix}$).
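As a quick sanity check of this small case, here is a numerical experiment (NumPy, with a rotation-angle parametrization of the $2\times 2$ orthogonal matrices; the variable names are my own choices) confirming $B=I_3$ for $m=n=2$, $r=1$:

```python
import numpy as np

rng = np.random.default_rng(1)

# m = n = 2, r = 1: U and V are unit 2-vectors, and U_perp, V_perp
# are their orthogonal complements (second columns of 2x2 rotations).
a, b = rng.uniform(0, 2 * np.pi, size=2)
U, U_perp = np.array([np.cos(a), np.sin(a)]), np.array([-np.sin(a), np.cos(a)])
V, V_perp = np.array([np.cos(b), np.sin(b)]), np.array([-np.sin(b), np.cos(b)])

d = 1 * (2 + 2) - 1**2  # = 3
B = np.zeros((d, d))
for i in range(2):
    for j in range(2):
        # All three blocks of phi(i, j) are scalars when r = 1.
        phi = np.array([U[i] * V[j], U[i] * V_perp[j], U_perp[i] * V[j]])
        B += np.outer(phi, phi)

print(np.allclose(B, np.eye(d)))  # → True
```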

I am struggling to handle the general case because I can't find a clean way to express $\phi(i,j)\phi(i,j)^T.$ If the above equality happens to be false in general, I would still be interested in the invertibility of $B$, and its minimal eigenvalue (I would really like to have $\lambda_{\min}(B) \geq 1$).
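For what it's worth, a numerical experiment on larger random instances (again NumPy; the helper `phi` and the QR-based construction of $\begin{bmatrix} U & U_{\perp} \end{bmatrix}$, $\begin{bmatrix} V & V_{\perp} \end{bmatrix}$ are my own choices, not part of the problem) is consistent with $B=I_d$ holding in general:

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(U, U_perp, V, V_perp, i, j):
    """Stack vec(U_i V_j^T), vec(U_i V_perp,j^T), vec(U_perp,i V_j^T)."""
    Ui, Upi = U[i][:, None], U_perp[i][:, None]  # rows as column vectors
    Vj, Vpj = V[j][:, None], V_perp[j][:, None]
    return np.concatenate([
        (Ui @ Vj.T).ravel(order="F"),   # column-major vec
        (Ui @ Vpj.T).ravel(order="F"),
        (Upi @ Vj.T).ravel(order="F"),
    ])

m, n, r = 5, 4, 2
# Random orthogonal matrices; the first r columns give U and V,
# the remaining columns give U_perp and V_perp.
Qm, _ = np.linalg.qr(rng.standard_normal((m, m)))
Qn, _ = np.linalg.qr(rng.standard_normal((n, n)))
U, U_perp = Qm[:, :r], Qm[:, r:]
V, V_perp = Qn[:, :r], Qn[:, r:]

d = r * (m + n) - r**2
B = sum(np.outer(p, p)
        for p in (phi(U, U_perp, V, V_perp, i, j)
                  for i in range(m) for j in range(n)))

print(np.allclose(B, np.eye(d)))  # → True in my runs
```

(The choice of column-major vs. row-major `vec` should not matter here: changing it permutes the coordinates of $\phi(i,j)$ by a fixed permutation $P$, which replaces $B$ by $PBP^T$ and so preserves both the identity $B=I_d$ and the spectrum.)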