Given a full row-rank $m\times n$ matrix $A$ with $n > m$ and a vector $v\in\mathbb{R}^n$, what is the most computationally efficient way to compute the orthogonal projection $$ x = A^\top(AA^\top)^{-1}A v? $$ I have some options:
Naive Computation
- Compute $Av$.
- Compute $AA^\top$
- Solve the system $AA^\top y = Av$ for $y$ (e.g. using numpy.linalg.solve()).
- Compute $x = A^\top y$.
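The naive steps above can be sketched as follows (the dimensions and the random test matrix are arbitrary, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))   # full row rank with probability 1
v = rng.standard_normal(n)

Av = A @ v                  # m-vector
G = A @ A.T                 # m x m Gram matrix A A^T
y = np.linalg.solve(G, Av)  # solve (A A^T) y = A v
x = A.T @ y                 # x = A^T (A A^T)^{-1} A v
```

Since $Ax = AA^\top(AA^\top)^{-1}Av = Av$, one can sanity-check the result by verifying `A @ x` agrees with `A @ v`.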
QR Decomposition
- Compute $Q = \text{qr}(A^\top)$ (e.g. using scipy.linalg.qr(A.T, mode='economic')).
- Compute $x = QQ^\top v$.
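A minimal sketch of the QR approach (random test matrix for illustration; note the parenthesization $Q(Q^\top v)$, which costs $O(mn)$ instead of forming the $n\times n$ matrix $QQ^\top$):

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
v = rng.standard_normal(n)

# Q has shape (n, m): orthonormal columns spanning the row space of A
Q, _ = qr(A.T, mode='economic')
x = Q @ (Q.T @ v)   # project v onto the row space of A
```

This avoids forming $AA^\top$ entirely, which is numerically more robust when $A$ is ill-conditioned (the condition number of $AA^\top$ is the square of that of $A$).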
Cholesky Decomposition
- Compute $L = \text{Cholesky}(AA^\top)$, lower triangular (e.g. scipy.linalg.cho_factor()).
- Solve the system $LL^\top y = Av$ for $y$ (e.g. scipy.linalg.cho_solve()).
- Compute $x = A^\top y$.
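A sketch of the Cholesky variant (random test matrix for illustration); this is valid because $AA^\top$ is symmetric positive definite whenever $A$ has full row rank:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
v = rng.standard_normal(n)

c, low = cho_factor(A @ A.T)     # Cholesky factor of the SPD matrix A A^T
y = cho_solve((c, low), A @ v)   # solve L L^T y = A v
x = A.T @ y
```

Compared to a general solve, the Cholesky factorization exploits symmetry and costs roughly half the flops of LU on the $m\times m$ system.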
LU Decomposition
- Compute $AA^\top$.
- Compute $LU = \text{LuDecomposition}(AA^\top)$ (e.g. scipy.linalg.lu_factor()).
- Solve $LUy = Av$ for $y$ (e.g. scipy.linalg.lu_solve()).
- Compute $x = A^\top y$.
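The LU variant can be sketched the same way (random test matrix for illustration):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
v = rng.standard_normal(n)

lu, piv = lu_factor(A @ A.T)     # pivoted LU of A A^T
y = lu_solve((lu, piv), A @ v)   # solve (A A^T) y = A v
x = A.T @ y
```

Note that LU ignores the symmetry of $AA^\top$, so it does about twice the factorization work of Cholesky on the same system; it offers no advantage here unless the SPD assumption is in doubt.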
Are there other alternatives?