Is it possible to define matrices $A \in \mathbb{R}^{m \times r}$ and $B \in \mathbb{R}^{r \times n}$, with $m > n \geq r$, so that the pseudoinverse of their product, $(A \cdot B)^+$, can be computed cheaply (e.g., by matrix multiplications and transpositions)?
I understand that the singular value decomposition could be used for that, by defining $A=U \cdot \Sigma$ and $B=V^T$, which would lead to $(A \cdot B)^+ = V \Sigma^+ U^T$.
However, I would like to avoid computations with large matrices from $\mathbb{R}^{m \times m}$ (like $U$) due to memory restrictions.
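To illustrate the identity above, here is a quick NumPy sketch (the variable names and dimensions are my own, chosen for the example). Note that it uses the *thin* SVD, so $U$ is only $m \times r$ here; the memory concern about a full $m \times m$ factor still applies to how $U$ is obtained in the first place.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 8, 5, 3  # m > n >= r

# Build a random rank-r matrix and take its thin SVD.
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
U, s, Vt = np.linalg.svd(M, full_matrices=False)
U, s, Vt = U[:, :r], s[:r], Vt[:r, :]  # keep the rank-r part

# Define A = U @ diag(s) and B = V^T, as in the question.
A = U * s   # shape (m, r), full column rank
B = Vt      # shape (r, n), full row rank

# (A B)^+ = V Sigma^+ U^T, computed with only multiplications
# and transpositions of the (small) factors.
pinv_fast = Vt.T @ np.diag(1.0 / s) @ U.T

# Check against the direct pseudoinverse of the product.
pinv_ref = np.linalg.pinv(A @ B)
print(np.allclose(pinv_fast, pinv_ref))
```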
Update: given the nature of my problem, $A$ must have full column rank (as must $(A\cdot B)^+$) and $B$ must have full row rank, so I cannot choose $A$ to be diagonal-like.
Update 1: given the nature of my problem, $A$ and $B$ should be "dense", in the sense that most of their elements are non-zero.
Update 2: it would also be OK if, instead of a product of two matrices, we had a series of simple operations on $N$ matrices, as long as the result is $m \times n$ and the dimensionality of every matrix involved is at most $m \times n$.