I want to show that for any square matrix $A \in \mathbb{C}^{n \times n}$ and any unitary $X \in \mathbb{C}^{n\times n}$ (that is, $XX^* = X^* X = I$) the following inequality holds:
$$ \text{Re} \{\text{trace}(X^* A)\} \leq \sum_{i =1}^{rank(A)}\sigma_{i}(A) \,, $$
where $\text{Re}\{\cdot\}$ denotes the real part and $\sigma_i(A)$ are the singular values of $A$.
I was able to show the result in what I feel is an awkward way. I am looking for a shorter proof based on well-established properties of the singular value decomposition and the trace, without the lengthy matrix decompositions I used.
Here is my solution:
(Note: $A = U \Sigma V^*$ is the SVD of the matrix $A$.) $$ \text{Re}\{\operatorname{tr}(X^*A)\} = \text{Re}\{\operatorname{tr}(X^*U\Sigma V^*)\} \underbrace{=}_{\operatorname{tr}(AB) = \operatorname{tr}(BA)} \text{Re}\{\operatorname{tr}(V^*XU\Sigma ) \} =\text{Re}\{\operatorname{tr}(S\Sigma)\}\,, $$
where $S= V^*XU$ is a unitary matrix, so its rows and columns are orthonormal (in particular, each column has norm $1$).
Write $S = \widetilde{S}_1 + \cdots + \widetilde{S}_n$, where each $\widetilde{S}_i\in \mathbb{C}^{n \times n}$ is zero everywhere except in column $i$, which equals column $i$ of $S$ (that is, I decompose $S$ into matrices each containing a single column of $S$).
This yields
$$ S\Sigma = \sigma_1 \widetilde{S}_1 + \cdots + \sigma_n \widetilde{S}_n\,. $$
Therefore I get
$$ \text{Re}\{\operatorname{tr}(S\Sigma)\} = \text{Re}\{\operatorname{tr}(\sigma_1 \widetilde{S}_1)+ \cdots + \operatorname{tr}(\sigma_n \widetilde{S}_n)\} = \text{Re}\{\sigma_1\operatorname{tr}(\widetilde{S}_1)+ \cdots + \sigma_n \operatorname{tr}( \widetilde{S}_n)\}\,. $$ Now, every entry of a vector is bounded in modulus by the norm of that vector. Since the columns of $S$ have norm one, and $\operatorname{tr}(\widetilde{S}_i) = S_{ii}$ is just the $i$-th diagonal entry of $S$, this yields $$ |\operatorname{tr}(\widetilde{S}_i)| \leq 1\, ,\quad \forall i\,.$$
Overall this implies
$$ \text{Re}\{\operatorname{tr}(S\Sigma)\} = \text{Re}\{\sigma_1\operatorname{tr}(\widetilde{S}_1)+ \cdots + \sigma_n \operatorname{tr}( \widetilde{S}_n)\}\leq \sum_{i= 1}^{n} \sigma_i = \sum_{i= 1}^{\operatorname{rank}(A)} \sigma_i\,, $$ where the last equality holds because $\sigma_i = 0$ for $i > \operatorname{rank}(A)$.
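As a numerical sanity check of the inequality (a sketch using NumPy; the dimension, seed, and random test matrices are arbitrary choices of mine, not part of the question), one can also verify the known fact that the bound is attained at $X = UV^*$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random complex test matrix A.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Random unitary X: the Q factor of a complex Gaussian matrix is unitary.
X, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

lhs = np.real(np.trace(X.conj().T @ A))          # Re{tr(X* A)}
rhs = np.sum(np.linalg.svd(A, compute_uv=False))  # sum of singular values
assert lhs <= rhs + 1e-10

# Equality is attained at X = U V^*, since then Re{tr(X* A)} = tr(Sigma).
U, s, Vh = np.linalg.svd(A)
X_opt = U @ Vh
assert np.isclose(np.real(np.trace(X_opt.conj().T @ A)), rhs)
```

Here $X_{\text{opt}} = UV^*$ makes $S = V^* X U = I$, so every $\operatorname{tr}(\widetilde{S}_i) = 1$ and the chain of inequalities above becomes a chain of equalities.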
Added later: I found another question whose answer proves the same result.
The polar decomposition of $A$ is $A = W R$, where $W$ is unitary and $R = (A^* A)^{1/2}$ has eigenvalues $\sigma_j$, the singular values of $A$ (including zeros). In terms of the SVD, you could write this as $A = U \Sigma V^* = (U V^*)(V \Sigma V^*)$. If $v_j$ form the corresponding orthonormal basis of eigenvectors of $R$ (with $R v_j = \sigma_j v_j$),
$$ \operatorname{tr}(X^* A) = \operatorname{tr}(X^*W R) = \sum_j v_j^* X^* W R v_j = \sum_j \sigma_j v_j^* X^*W v_j\,. $$ Now use the fact that $|v_j^* X^*W v_j| \le \|v_j\| \|X^* W v_j\| = 1$, which gives $\text{Re}\{\operatorname{tr}(X^*A)\} \le \sum_j \sigma_j$.
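The polar factorization used above can also be checked numerically (again a NumPy sketch with arbitrary size and seed of my choosing): $W = UV^*$ is unitary, $R = V\Sigma V^*$ is Hermitian with eigenvalues equal to the singular values, and the diagonal terms $|v_j^* X^* W v_j|$ are indeed at most $1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Polar decomposition via the SVD: A = U S V^* = (U V^*)(V S V^*) = W R.
U, s, Vh = np.linalg.svd(A)
W = U @ Vh                          # unitary factor
R = Vh.conj().T @ np.diag(s) @ Vh   # Hermitian PSD factor

assert np.allclose(W @ R, A)
# Eigenvalues of R are the singular values of A (eigvalsh returns ascending order).
assert np.allclose(np.linalg.eigvalsh(R), np.sort(s))

# For any unitary X, each |v_j^* X^* W v_j| <= ||v_j|| ||X^* W v_j|| = 1.
X, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
V = Vh.conj().T                     # columns v_j are eigenvectors of R
vals = [abs(V[:, j].conj() @ X.conj().T @ W @ V[:, j]) for j in range(n)]
assert max(vals) <= 1 + 1e-10
```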