Suppose $\mathbf{H}=\mathbf{F\Lambda G}^*$ is the singular value decomposition (SVD), where $\mathbf{\Lambda}$ ordinarily has the same dimensions as $\mathbf{H}$. I am wondering whether it is possible to somehow ensure that $\mathbf{\Lambda}$ is a square diagonal matrix, by some row or column augmentation, irrespective of the dimensions of $\mathbf{H}$. The following reference (from network information theory, which is my area) asserts in its Appendix that this is possible, but without giving any reference or justification.
Assume that the elements are complex numbers, for the sake of generality.
Cheers!
[1] S. Vishwanath, N. Jindal, and A. J. Goldsmith, “Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels,” IEEE Transactions on Information Theory, vol. 49, no. 10, pp. 2658–2668, Oct. 2003. http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=1237143
$$ \mathbf{H}=\mathbf{F\Lambda G}^* $$ If $\mathbf H \in \mathbb R^{\ell\times n}$, one form of the theorem states that $\mathbf F\in \mathbb R^{\ell\times\ell}$ is an orthogonal matrix (unitary, if one works over $\mathbb C$), $\mathbf\Lambda\in\mathbb R^{\ell\times n}$ is a rectangular diagonal matrix whose diagonal entries are $\lambda_1\ge \cdots \ge \lambda_r>0$ followed by zeros, where $r$ is the rank of $\mathbf H$, and $\mathbf G\in\mathbb R^{n\times n}$ is an orthogonal (again, unitary in the complex case) matrix with adjoint $\mathbf G^*$.
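To make the block structure explicit (the symbol $\mathbf D$ is notation introduced here for illustration, not taken from the question or the paper): with $r$ the rank,
$$
\mathbf\Lambda=\begin{pmatrix}\mathbf D & \mathbf 0\\ \mathbf 0 & \mathbf 0\end{pmatrix}\in\mathbb R^{\ell\times n},
\qquad
\mathbf D=\operatorname{diag}(\lambda_1,\dots,\lambda_r)\in\mathbb R^{r\times r},
$$
where the zero blocks have whatever sizes are needed to pad $\mathbf D$ out to $\ell\times n$ (some of them may be empty when $r=\ell$ or $r=n$).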
But $\mathbf H$ has only $r\le\ell$ independent columns, and every column of $\mathbf H$ is a linear combination of the columns of $\mathbf F$. Just from the definition of matrix multiplication, you can see that the first $r$ columns of $\mathbf F$ form an orthonormal basis of the column space of $\mathbf H$, while the last $\ell-r$ columns of $\mathbf F$ only ever get multiplied by $0$ in the product $\mathbf{F\Lambda}$, because the last $\ell-r$ rows of $\mathbf\Lambda$ are zero. Thus if one discards the last $\ell-r$ columns of $\mathbf F$ and the last $\ell-r$ rows of $\mathbf\Lambda$, one does not actually alter the product.
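Spelled out with the partition above (the splitting $\mathbf F=(\mathbf F_1\ \ \mathbf F_2)$ is notation introduced here for illustration):
$$
\mathbf{F\Lambda}
=\begin{pmatrix}\mathbf F_1 & \mathbf F_2\end{pmatrix}
 \begin{pmatrix}\mathbf D & \mathbf 0\\ \mathbf 0 & \mathbf 0\end{pmatrix}
=\begin{pmatrix}\mathbf F_1\mathbf D & \mathbf 0\end{pmatrix},
$$
where $\mathbf F_1$ holds the first $r$ columns of $\mathbf F$ and $\mathbf F_2$ the last $\ell-r$; the block $\mathbf F_2$ never shows up on the right-hand side.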
A similar argument applies to $\mathbf G$, with rows of $\mathbf G^*$, thus columns of $\mathbf G$: discarding the last $n-r$ columns of $\mathbf G$ (equivalently, the last $n-r$ rows of $\mathbf G^*$) together with the last $n-r$ columns of $\mathbf\Lambda$ also leaves the product unchanged. What remains is $\mathbf H=\mathbf F_1\mathbf D\,\mathbf G_1^*$, where $\mathbf F_1$ ($\ell\times r$) and $\mathbf G_1$ ($n\times r$) consist of the first $r$ columns of $\mathbf F$ and $\mathbf G$, and $\mathbf D=\operatorname{diag}(\lambda_1,\dots,\lambda_r)$ is square and diagonal, whatever the dimensions of $\mathbf H$; this is the compact (or "thin") SVD.
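For a quick numerical sanity check, here is a minimal sketch using NumPy (a hypothetical $4\times 6$ complex $\mathbf H$ of rank $2$; the sizes, variable names, and rank tolerance are all assumptions made for this example, not anything from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x6 complex H of rank 2, built as a product of thin factors.
A = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
B = rng.standard_normal((2, 6)) + 1j * rng.standard_normal((2, 6))
H = A @ B

# Full SVD: F is 4x4, G^* is 6x6, and the singular values come back as a
# vector that would pad out to a 4x6 rectangular diagonal Lambda.
F, s, Gh = np.linalg.svd(H, full_matrices=True)

# Numerical rank: singular values above a small relative threshold (assumed).
r = int(np.sum(s > 1e-10 * s[0]))

F1 = F[:, :r]              # first r columns of F      (4 x r)
G1 = Gh[:r, :].conj().T    # first r columns of G      (6 x r)
D = np.diag(s[:r])         # square diagonal Lambda    (r x r)

# Discarding the trailing columns of F and rows of G^* leaves the product unchanged.
assert np.allclose(F1 @ D @ G1.conj().T, H)
print(D.shape)             # (2, 2)
```

Note that `np.linalg.svd` returns the singular values as a vector, so forming the square $\mathbf D$ (or the rectangular $\mathbf\Lambda$) is left to the caller.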