Orthogonal projection for a lower block-triangular, non-square matrix $A \in \mathbb{R}^{4k \times 7k}$ with $A_{i,j}\in \{0,1\}$


Suppose I have a very large matrix $$A \in \mathbb{R}^{4k \times 7k}$$ with $$A_{i,j}\in \{0,1\},$$ where $A$ is block lower triangular with diagonal blocks $$B_t\in \mathbb{R}^{4\times 7}$$ and blocks left of the diagonal $$C_t\in\mathbb{R}^{4\times 7(t-1)},$$ with $\operatorname{Rank}(C_t)\leq 2$, $\operatorname{RowRank}(C_t)\leq 1$, $\operatorname{ColumnRank}(C_t)\leq 1$, and $t$ the position of the diagonal block. Clearly, computing a solution of $Ax = b$ for any given $b$ is very fast (by block forward substitution). The problem is that the resulting $x$ is almost never the minimum-norm solution: writing $$x=A^+b+(I-A^+A)w,$$ the probability $P(w \neq 0)$ is high when I take advantage of the matrix structure. On the other hand, pseudo-inverting the whole matrix might take days as opposed to seconds. My problem is projecting a vector $c$, i.e. enforcing $w = c$, while still taking advantage of the structure. Take the simple case where I already have a solution $x_0$ and want $c=0$: if I obtain my solution strictly by exploiting the block lower-triangular structure, then with high probability $x \neq A^+b$.
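To make the mismatch concrete, here is a small NumPy sketch under illustrative assumptions (small $k$, rank-1 $0/1$ blocks left of the diagonal, full-row-rank $B_t$ as implied by $B_tB_t^+$ being invertible): a block-wise solve with locally minimum-norm steps still satisfies $Ax=b$, but its norm generally exceeds that of $A^+b$.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3
m, n = 4 * k, 7 * k

def random_full_row_rank(rows, cols):
    # Draw 0/1 blocks until one has full row rank
    # (consistent with B_t B_t^+ being invertible).
    while True:
        B = rng.integers(0, 2, (rows, cols)).astype(float)
        if np.linalg.matrix_rank(B) == rows:
            return B

# Assemble the block lower-triangular 0/1 matrix.
A = np.zeros((m, n))
for t in range(k):
    A[4*t:4*(t+1), 7*t:7*(t+1)] = random_full_row_rank(4, 7)  # diagonal block B_t
    if t > 0:
        # Rank-1 0/1 block C_t left of the diagonal (an assumption for the demo).
        u = rng.integers(0, 2, (4, 1))
        v = rng.integers(0, 2, (1, 7 * t))
        A[4*t:4*(t+1), :7*t] = np.minimum(u @ v, 1).astype(float)

b = rng.standard_normal(m)

# Structure-exploiting solve: each block takes the locally
# minimum-norm step B_t^+ (b_t - C_t x_prev).
x_blk = np.zeros(n)
for t in range(k):
    rhs = b[4*t:4*(t+1)] - A[4*t:4*(t+1), :7*t] @ x_blk[:7*t]
    x_blk[7*t:7*(t+1)] = np.linalg.pinv(A[4*t:4*(t+1), 7*t:7*(t+1)]) @ rhs

x_min = np.linalg.pinv(A) @ b  # the true minimum-norm solution (expensive at scale)

print(np.linalg.norm(A @ x_blk - b))  # ~0: the block solve does satisfy Ax = b
print(np.linalg.norm(x_blk), np.linalg.norm(x_min))  # but the norms differ
```

The point of the sketch: both vectors solve the system, yet $\lVert x_{\text{blk}}\rVert \geq \lVert A^+b\rVert$, with equality only in lucky cases, because the early blocks fix their components before seeing the later ones.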

Why it gives the wrong projection:

We already have $x_0=A^+b+(I-A^+A)r$ for some $r$, so the final projection ends up being inaccurate. In fact, it is easy to see that the first blocks greedily satisfy the system as the forward substitution algorithm moves downward.
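One useful identity here: whenever $Ax_0 = b$, projecting $x_0$ onto the row space of $A$ kills the null-space part, since $A^+Ax_0 = A^+A(A^+b + (I-A^+A)r) = A^+b$. A small NumPy check of this identity (the sizes and random data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 7
while True:
    A = rng.integers(0, 2, (m, n)).astype(float)
    if np.linalg.matrix_rank(A) == m:  # full row rank, so Ax = b is solvable
        break
b = rng.standard_normal(m)

Ap = np.linalg.pinv(A)
r = rng.standard_normal(n)
x0 = Ap @ b + (np.eye(n) - Ap @ A) @ r  # a particular solution with w = r != 0

# Projecting x0 onto the row space of A recovers the minimum-norm solution:
# A^+ A x0 = A^+ b.
x_min = Ap @ A @ x0

print(np.allclose(A @ x0, b))      # True: x0 solves the system
print(np.allclose(x_min, Ap @ b))  # True: projection recovers A^+ b
```

Of course, forming $A^+A$ directly is as expensive as the full pseudoinverse; the question is precisely how to apply this projector cheaply using the block structure.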

What I have tried:

A block $LDL^T$ decomposition (also fast) of $A^TA$. But that does not make anything easier, since $(LDL^T)^+\neq (L^T)^+D^+L^+$ in general.

Edit: In this block $LDL^T$ decomposition, $L$ is invertible (with $1$s on the diagonal), while the block-diagonal matrix $D$ is not. Each diagonal block of $D$ equals $B_t^+B_t$, if that helps; also $B_tB_t^+$ is invertible, and $A=\{B_t\}L^T$ (the block-diagonal matrix of the $B_t$ times $L^T$).
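For context, these claims are self-consistent: each block $B_t^+B_t$ is the orthogonal projector onto the row space of $B_t$ (symmetric, idempotent, rank $\leq 4 < 7$, hence singular), which is why $D$ cannot be invertible, while $B_tB_t^+ = I_4$ when $B_t$ has full row rank. A quick NumPy check with a random $0/1$ block (an illustrative stand-in for $B_t$):

```python
import numpy as np

rng = np.random.default_rng(2)
while True:
    B = rng.integers(0, 2, (4, 7)).astype(float)
    if np.linalg.matrix_rank(B) == 4:  # full row rank, so B B^+ is invertible
        break

P = np.linalg.pinv(B) @ B  # a diagonal block of D: projector onto row space of B
G = B @ np.linalg.pinv(B)  # equals I_4 when B has full row rank

print(np.allclose(P @ P, P), np.allclose(P, P.T))  # True True: orthogonal projector
print(np.linalg.matrix_rank(P))                    # 4 (< 7, so P and D are singular)
print(np.allclose(G, np.eye(4)))                   # True
```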

I know this is not a CS forum, I honestly do not need a CS solution but rather a mathematical solution and intuition.