So I'm referring to the paper Exact Matrix Completion via Convex Optimization: https://arxiv.org/pdf/0805.4471.pdf
If you look on page 6 of this paper, the coherence of a subspace $U \subseteq \mathbb{R}^n$ of dimension $r$ is defined as $\mu(U) = \frac{n}{r} \max_{1 \le i \le n} \|P_U e_i\|_2^2$, where $e_i$ is the standard basis vector with a $1$ in the $i$-th entry and $P_U$ is the orthogonal projection onto $U$.
Now we are considering some matrix $M \in \mathbb{R}^{n_1 \times n_2}$ with SVD $M = \sum_{1 \le k \le r} \sigma_k u_k v_k^*$. There is a property A0 which states that the column and row spaces $U$ and $V$ obey $\max(\mu(U), \mu(V)) \le \mu_0$ for some $\mu_0 > 0$.
Now if you look a bit below, they show by Cauchy–Schwarz that $\left|\sum_k u_{ik}v_{jk} \right| \le \sqrt{\sum_k|u_{ik}|^2}\sqrt{\sum_k|v_{jk}|^2}$ (this part is obvious to me), but then they say that this whole thing is $\le \frac{\mu_0 r}{\sqrt{n_1n_2}}$, and how they jumped to this makes no sense to me. I do notice that $\mu(U) \le \mu_0 \implies \max_i \|P_Ue_i\|_2^2 \le \frac{\mu_0 r}{n_1}$, so if you multiply the lengths of the maximum projections of the $e_i$ onto $U$ and $V$, you get the quantity $\frac{\mu_0 r}{\sqrt{n_1n_2}}$. Note, for example, that $u_{ik}$ is the $i$-th entry of the $k$-th left singular vector, and similarly for $v_{jk}$. If someone could help me prove this result it would be much appreciated.
Note that $\max_{i} \| P_U e_i \|_2^2 \leq \frac{\mu_0 r}{n_1}$ implies that $\max_{i} \| u_{i} \|_2^2 \leq \frac{\mu_0 r}{n_1}$, where $u_i$ is the $i^{\text{th}}$ row of the matrix of left singular vectors $U$. This is because $P_U e_i = UU^{\mathsf{T}} e_i$, and since $U$ has orthonormal columns, $\| UU^{\mathsf{T}} e_i \|_2 = \| U^{\mathsf{T}} e_i \|_2 = \|u_i\|_2$ for each $i$. The same argument applies to $P_V$, giving $\max_{j} \|v_j\|_2^2 \leq \frac{\mu_0 r}{n_2}$.
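As a quick numerical sanity check of this identity (a sketch using NumPy, with a random subspace built via QR rather than from any particular matrix), one can verify that $\|P_U e_i\|_2 = \|u_i\|_2$ for every $i$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 3

# Orthonormal basis U for a random r-dimensional subspace of R^n.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
P = U @ U.T  # orthogonal projector onto col(U)

for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0
    # ||P_U e_i||_2 equals the norm of the i-th row of U.
    assert np.isclose(np.linalg.norm(P @ e_i), np.linalg.norm(U[i]))
```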
Now using these identities, observe that
$$ \sqrt{\sum_{k} u_{ik}^2} \leq \max_{i} \sqrt{\sum_{k} u_{ik}^2} = \max_{i} \| u_i\|_2 \leq \sqrt{\frac{\mu_0 r}{n_1}}, \\ \sqrt{\sum_{k} v_{jk}^2} \leq \max_{j} \sqrt{\sum_{k} v_{jk}^2} = \max_{j} \| v_j\|_2 \leq \sqrt{\frac{\mu_0 r}{n_2}}. $$
Multiplying the two bounds gives $\left|\sum_k u_{ik} v_{jk}\right| \leq \|u_i\|_2 \|v_j\|_2 \leq \sqrt{\frac{\mu_0 r}{n_1}}\sqrt{\frac{\mu_0 r}{n_2}} = \frac{\mu_0 r}{\sqrt{n_1 n_2}}$, which is the claimed result.
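The whole chain of inequalities can also be checked numerically. Below is a minimal sketch (not from the paper) that draws a random rank-$r$ matrix, computes $\mu(U)$ and $\mu(V)$ from its SVD, sets $\mu_0 = \max(\mu(U), \mu(V))$, and confirms the entrywise bound $\max_{i,j} |(UV^{\mathsf{T}})_{ij}| \le \mu_0 r / \sqrt{n_1 n_2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r = 30, 20, 4

# Random rank-r matrix, then its (thin) SVD.
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
U, s, Vt = np.linalg.svd(M, full_matrices=False)
U, V = U[:, :r], Vt[:r, :].T  # orthonormal columns

# Coherence: mu(U) = (n/r) * max_i ||P_U e_i||^2 = (n/r) * max_i ||u_i||^2,
# where u_i is the i-th row of U.
mu_U = (n1 / r) * np.max(np.sum(U**2, axis=1))
mu_V = (n2 / r) * np.max(np.sum(V**2, axis=1))
mu0 = max(mu_U, mu_V)

# Entrywise bound: |(U V^T)_{ij}| = |sum_k u_{ik} v_{jk}| <= mu0 * r / sqrt(n1*n2).
lhs = np.max(np.abs(U @ V.T))
rhs = mu0 * r / np.sqrt(n1 * n2)
assert lhs <= rhs
```

The bound holds with $\mu_0 = \max(\mu(U), \mu(V))$ because $\sqrt{\mu(U)\mu(V)} \le \mu_0$; any larger $\mu_0$ from assumption A0 only makes the right-hand side bigger.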