Help on Inequality with Matrix Singular Values

Given $A \in \mathbb{R}^{p \times d}$, $X \in \mathbb{R}^{p \times r}$ and $Y \in \mathbb{R}^{d \times r}$, where each of $X$ and $Y$ has orthonormal columns and $r=\operatorname{rank}(A)\leq \min{(p, d)}$, prove that $|\langle X, AY \rangle| \leq \sqrt{r\sum_{i=1}^r \sigma_i^2(A)}$, where $\sigma_i(A)$ is the $i$th largest singular value of $A$.

I've rewritten $\sum_{i=1}^r \sigma_i^2(A)$ as $\Vert A \Vert_F^2$ and rearranged the expression, but I'm not sure how to proceed, or whether I'm even going in the right direction.


By changing orthonormal bases on both sides (replace $X$ by $U^TX$ and $Y$ by $V^TY$, where $A=U\Sigma V^T$ is a singular value decomposition; this leaves $\langle X, AY\rangle = \operatorname{tr}(X^TAY)$ unchanged and preserves orthonormality of the columns), you may assume that $A$ is the diagonal matrix of singular values. Writing $X_{k\ast}$ and $Y_{k\ast}$ for the $k$th rows of $X$ and $Y$, it follows that $$ |\langle X,AY\rangle|=\left|\sum_{k=1}^r\sigma_k(A)\langle X_{k\ast},Y_{k\ast}\rangle\right|\le\sum_{k=1}^r\sigma_k(A), $$ where the inequality holds because each row of a matrix with orthonormal columns has Euclidean norm at most one, so $|\langle X_{k\ast},Y_{k\ast}\rangle|\le 1$ by Cauchy–Schwarz. Now let $s=(\sigma_1(A),\ldots,\sigma_r(A))^T$ and $e=(1,\ldots,1)^T$. By Cauchy–Schwarz again, $$ \sum_{k=1}^r\sigma_k(A)=\langle s,e\rangle\le\|s\|_2\|e\|_2=\sqrt{r\sum_{i=1}^r\sigma_i^2(A)}. $$
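As a numerical sanity check (not part of the proof), one can verify the bound on random instances with NumPy, using $\langle X, AY\rangle = \operatorname{tr}(X^TAY)$ and QR factorizations of random Gaussian matrices to produce orthonormal columns:

```python
import numpy as np

# Illustrative check of |<X, AY>| <= sqrt(r * sum_i sigma_i(A)^2)
# on random instances; dimensions here are arbitrary choices.
rng = np.random.default_rng(0)

for _ in range(100):
    p, d = 7, 5
    A = rng.standard_normal((p, d))
    r = np.linalg.matrix_rank(A)
    # Matrices with orthonormal columns via QR of Gaussian matrices.
    X, _ = np.linalg.qr(rng.standard_normal((p, r)))
    Y, _ = np.linalg.qr(rng.standard_normal((d, r)))
    inner = np.trace(X.T @ A @ Y)  # <X, AY> = tr(X^T A Y)
    sigma = np.linalg.svd(A, compute_uv=False)[:r]  # top r singular values
    bound = np.sqrt(r * np.sum(sigma**2))
    assert abs(inner) <= bound + 1e-9
print("bound holds on all random instances")
```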