This question is a generalisation of Eigenvalues of $AB$ and $BA$ where $A$ and $B$ are rectangular matrices which itself is a generalisation of Eigenvalues of $AB$ and $BA$ where $A$ and $B$ are square matrices.
Let $A$ be an $m \times n$ matrix and $B$ an $n \times k$ matrix. Obviously, the matrix product $AB$ is possible, whereas the product $BA$ is not. Assume $n<k<m$, so that $AB$ is a large matrix.
Is there anything we can do to either matrix $A$ or $B$, such that the product $BA$ becomes possible and such that the eigenvalues of $BA$ say something about the eigenvalues of the original $AB$?
I am thinking of procedures such as:
- Truncating $A$ (making it $k \times n$)
- Appending some values to $B$ (making it $n \times m$)
- Interpolating values in $B$
- Taking random samples
- etc.
Motivation 1 (theoretical): The matrix $AB$ is large and clearly rank-deficient (its rank is at most $n < m$). Therefore, there must be a smaller matrix which captures the same information as $AB$ (i.e. has the same nonzero eigenvalues). If $k=m$, then $BA$ would be such a smaller matrix, as discussed in Eigenvalues of $AB$ and $BA$ where $A$ and $B$ are rectangular matrices.
Motivation 2 (practical): The eigendecomposition of a very large matrix is computationally expensive and may require special hardware. If the problem can be simplified, e.g. by decomposing the smaller $BA$, then the analysis can be performed more efficiently.
Alternatively, is there anything we can say about the eigenvalues of $AB$ without performing the product, i.e. based on analyses of $A$ and $B$ separately?
EDIT:
1) Yes. There is something which can be done. Let $A$ be an $m$ by $n$ matrix and let $B$ be an $n$ by $k$ matrix with $n < k < m$. Then $AB$ is defined, but $BA$ is not. Augment $B$ with $m-k$ columns of zeros, so that $C = [B \: 0]$ is $n$ by $m$. This is a trivial extension of the operator $B$ to a larger domain. Then $AC$ and $CA$ are both defined. The matrix $AC$ is $m$ by $m$ (large), while the matrix $CA$ is $n$ by $n$, but they have the same nonzero eigenvalues.
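A quick numerical check of the zero-padding construction, as a sketch in NumPy (the dimensions are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 7, 3, 5  # n < k < m, as in the question

A = rng.standard_normal((m, n))
B = rng.standard_normal((n, k))

# Augment B with m - k columns of zeros so that C = [B 0] is n x m.
C = np.hstack([B, np.zeros((n, m - k))])

# AC is m x m (large), CA is n x n (small);
# they share the same nonzero eigenvalues.
eig_AC = np.linalg.eigvals(A @ C)  # m eigenvalues, m - n of them (numerically) zero
eig_CA = np.linalg.eigvals(C @ A)  # n eigenvalues

# The n largest-modulus eigenvalues of AC match those of CA.
assert np.allclose(np.sort(np.abs(eig_CA)),
                   np.sort(np.abs(eig_AC))[m - n:])
```

Note that the zero columns do not change the product $AB$ restricted to the original coordinates; they merely make the reversed product well defined.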
2) No. In general, there is little which can be learned from the separate analysis of $A$ and $B$. One extreme example is the case of nonsingular $A$ and $B = A^{-1}$. The work involved in producing from scratch the eigendecomposition of $A$ and $B$ is all for naught, as $AB = I$. Another example is the case where $A$ and $B$ are block diagonal matrices with $$A = \begin{bmatrix} A_{11} & 0 \\ 0 & 0 \end{bmatrix}, \quad B = \begin{bmatrix} 0 & 0 \\ 0 & B_{22} \end{bmatrix},$$ so that $AB=0$.
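Both extreme cases are easy to verify numerically; here is a minimal sketch (the dimensions and the diagonal shift used to guarantee invertibility are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Extreme case 1: B = A^{-1}. Whatever the spectra of A and B are
# individually, the product is the identity, all eigenvalues equal to 1.
A = rng.standard_normal((n, n)) + n * np.eye(n)  # shift ensures invertibility
B = np.linalg.inv(A)
assert np.allclose(A @ B, np.eye(n))

# Extreme case 2: block-diagonal A and B with complementary blocks.
# Both factors can have nontrivial spectra, yet AB = 0.
A2 = np.zeros((n, n)); A2[:2, :2] = rng.standard_normal((2, 2))
B2 = np.zeros((n, n)); B2[2:, 2:] = rng.standard_normal((2, 2))
assert np.allclose(A2 @ B2, np.zeros((n, n)))
```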
Original answer:
This may not be the answer that you are looking for, but knowing the SVD of a matrix is frequently useful and it might very well solve your underlying problem.
First let me justify a slight change in notation. By convention, vectors are column vectors unless explicitly identified as row vectors, and this convention carries over to rectangular matrices.
Here we are interested in a product $AB^T$ where $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{k \times n}$ are tall matrices, i.e. $n \ll m$ and $n \ll k$. In particular, $m$ and $k$ are so large that we have neither the space to store the product explicitly nor the time to compute the SVD of the product even if enough storage could be made available.
Below follows the standard trick for this situation.
We can compute economy size QR factorizations of $A$ and $B$, i.e. $$A = QR, \quad B = VS,$$ where $Q \in \mathbb{R}^{m \times n}$ and $R \in \mathbb{R}^{n \times n}$, and $V \in \mathbb{R}^{k \times n}$ and $S \in \mathbb{R}^{n \times n}$. This is essentially Gram-Schmidt orthogonalization of $A$ and $B$. The costs are $O(mn^2)$ and $O(kn^2)$ arithmetic operations. Then $$AB^T = QRS^T V^T.$$
The small matrix $RS^T \in \mathbb{R}^{n \times n}$ requires $O(n^3)$ operations to form and another $O(n^3)$ operations to obtain its SVD, i.e. $$ RS^T = \bar{Q} \Sigma \bar{V}^T. $$ Returning to the product $AB^T$, we see that we have in fact computed its SVD, as $$ AB^T = Q R S^T V^T = Q \bar{Q} \Sigma \bar{V}^T V^T = (Q \bar{Q}) \Sigma (V \bar{V})^T.$$
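The whole procedure can be sketched in a few lines of NumPy (the sizes are illustrative; at realistic scales one would of course skip the direct check at the end):

```python
import numpy as np

rng = np.random.default_rng(2)
m, k, n = 200, 150, 5  # tall A (m x n) and B (k x n), with n << m, k

A = rng.standard_normal((m, n))
B = rng.standard_normal((k, n))

# Economy-size QR factorizations: O(m n^2) and O(k n^2) operations.
Q, R = np.linalg.qr(A)  # Q: m x n with orthonormal columns, R: n x n
V, S = np.linalg.qr(B)  # V: k x n with orthonormal columns, S: n x n

# SVD of the small n x n core R S^T: O(n^3) operations.
Qbar, sigma, VbarT = np.linalg.svd(R @ S.T)

# Assembled factors of the SVD of A B^T, never forming the m x k product:
U = Q @ Qbar       # m x n, orthonormal columns
W = V @ VbarT.T    # k x n, orthonormal columns

# Sanity check against the explicit product (feasible at this toy size).
assert np.allclose(U @ np.diag(sigma) @ W.T, A @ B.T)
```

The products $Q\bar{Q}$ and $V\bar{V}$ preserve orthonormal columns, so `U`, `sigma`, `W` form a valid (economy-size) SVD of $AB^T$ at a cost of $O((m+k)n^2)$ rather than $O(mk\min(m,k))$.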