Let's say $A$ is an $m \times n$ matrix with $n \gg m$. We multiply $A$ by a random Gaussian matrix $G$ of dimension $n \times (k+p)$ with i.i.d. $\mathcal{N}(0,1)$ entries, where $k$ is the desired rank, $p$ is an oversampling parameter, and $(k+p) < m$. Let $Y = AG$. If we compute a QR factorization of $Y$ and take the orthonormal column basis $Q$, then $\| A - QQ^TA\|$ is probabilistically bounded (ref).
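For concreteness, here is a minimal NumPy sketch of that standard randomized range finder; the sizes and the synthetic nearly-rank-$k$ matrix are illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, k, p = 50, 2000, 10, 5                # n >> m and (k + p) < m
# Synthetic test matrix: rank-k signal plus small noise.
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n)) \
    + 0.01 * rng.standard_normal((m, n))

G = rng.standard_normal((n, k + p))         # Gaussian test matrix
Y = A @ G                                   # m x (k+p) sample of range(A)
Q, _ = np.linalg.qr(Y)                      # orthonormal basis for range(Y)

err = np.linalg.norm(A - Q @ Q.T @ A, 2)    # spectral-norm residual
```

The residual `err` should be on the order of $\sigma_{k+1}(A)$, which is what the probabilistic bound controls.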
My question is the following: instead of multiplying $A$ by a thin matrix (with fewer than $m$ columns), we multiply it by an $n \times t$ Gaussian matrix with $t \geq m$. Then we compute the SVD of $Y$, $\mbox{SVD}(Y) = U \Sigma V^T$, and use the first $k$ columns $U[:, :k]$ for the low-rank approximation. Is there any bound on $\| A - U[:, :k]\,U[:, :k]^TA\|$? Experimentally, the results are much better, but I could not find a proof for this algorithm.
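A small experiment comparing the two methods on the same matrix might make the question concrete (a sketch with assumed illustrative sizes; `err_svd` below is the quantity I am asking a bound for). Note that with $G$ of size $n \times t$ we have $\mathbb{E}[YY^T] = t\,AA^T$, so the left singular vectors of $Y$ should concentrate around those of $A$, which may explain the good empirical behavior, but that is a heuristic, not a proof:

```python
import numpy as np

rng = np.random.default_rng(1)

m, n, k, p, t = 50, 2000, 10, 5, 100        # t >= m for the second method
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n)) \
    + 0.01 * rng.standard_normal((m, n))

# Method 1: thin Gaussian sketch + QR (standard randomized range finder).
Q, _ = np.linalg.qr(A @ rng.standard_normal((n, k + p)))
err_qr = np.linalg.norm(A - Q @ Q.T @ A, 2)

# Method 2: wide Gaussian sketch (t >= m) + SVD, keep top-k left vectors.
U, s, Vt = np.linalg.svd(A @ rng.standard_normal((n, t)),
                         full_matrices=False)
Uk = U[:, :k]
err_svd = np.linalg.norm(A - Uk @ Uk.T @ A, 2)
```

One remark: when $t \geq m$, $Y$ almost surely spans all of $\mathbb{R}^m$, so taking $U[:, :k]$ is a genuine truncation of an estimated singular basis rather than a range approximation, which is why the QR-based bound does not apply directly.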