Let $A, B$ be matrices over $\mathbb{C}$ of the same dimension (not necessarily square). With $'$ denoting conjugate-transpose, and tr the trace, show for $n\in\mathbb{N}$ that
$ 2\,\mathrm{tr} [(AB')^n] \le \mathrm{tr}[(AA')^n] + \mathrm{tr}[(BB')^n] $.
Now I have proven this, by a slightly long-winded, but nevertheless attractive, route, which consists in showing that RHS$-$LHS is a sum of squares. Write $\|M\|^2 := \mathrm{tr}(MM')$. When $n=1$, RHS$-$LHS is $\|A-B\|^2$, and when $n=2$, it is $\|A'A-B'B\|^2+\|AB'-BA'\|^2$. By an iterative argument I can show that such a decomposition exists for all $n$: basically I write RHS$-$LHS as a sum of squares plus a remainder term, I have a procedure that makes the remainder term smaller and smaller, and I use compactness to show that in the limit the remainder goes to $0$. However, I'm in effect using analysis to prove what looks like an algebraic result, and in general my expansion of RHS$-$LHS seems always to give rise to a sum of squares with rational coefficients, which my analysis-based proof won't show. So I'm wondering whether there is a neater way of doing it.
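(For concreteness, here is a quick numerical sanity check of the $n=1$ and $n=2$ identities. It is only an illustrative sketch, not part of the argument: it uses numpy, random complex rectangular matrices of made-up shape, and compares against the real part of $\mathrm{tr}[(AB')^n]$, which is what the sum-of-squares identities control.)

```python
# Sanity check of the n=1 and n=2 sum-of-squares identities.
# Illustrative setup: random complex rectangular matrices; ' is the conjugate transpose.
import numpy as np

rng = np.random.default_rng(0)
m, k = 4, 6
A = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
B = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
Ah, Bh = A.conj().T, B.conj().T   # A', B'

def frob2(M):
    """Squared Frobenius norm ||M||^2 = tr(M M')."""
    return np.real(np.trace(M @ M.conj().T))

# n = 1: RHS - LHS = ||A - B||^2   (LHS taken as 2 Re tr(AB'))
gap1 = frob2(A) + frob2(B) - 2 * np.real(np.trace(A @ Bh))
print(np.isclose(gap1, frob2(A - B)))          # expect True

# n = 2: RHS - LHS = ||A'A - B'B||^2 + ||AB' - BA'||^2
tr2 = lambda M: np.real(np.trace(M @ M))       # Re tr(M^2)
gap2 = tr2(A @ Ah) + tr2(B @ Bh) - 2 * tr2(A @ Bh)
sos2 = frob2(Ah @ A - Bh @ B) + frob2(A @ Bh - B @ Ah)
print(np.isclose(gap2, sos2))                  # expect True
```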
As applications, I'm wondering about combinatorics and maybe something to do with path integrals(?)
Old thread, but no response, so here is an answer.
Note on dimensions
Since $ \text{trace}\Big(\big(AB^*\big)^n\Big) = \text{trace}\Big(\big(B^*A\big)^n\Big) $
and we are dealing with rectangular matrices, to streamline things, assume WLOG that the left factor is tall and skinny, and call it $A$, and the right factor is short and fat, and call it $B^*$. Then we can append zero columns (on the right) to $A$ until it is square, and append zero rows to the bottom of $B^*$ until it is square. This doesn't change the product $AB^*$ (so in particular its singular values and eigenvalues are unchanged), and it also doesn't change the nonzero singular values of $A$ (check that $AA^*$ is the same before and after augmentation); similarly it doesn't change the nonzero singular values of $B^*$. Working with this augmented square setup avoids a lot of potential unpleasantness in what follows, so we proceed assuming WLOG that each matrix is $m \times m$.
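To see the padding claim in action, here is a small numpy sketch (the shapes are made up; it just checks that the augmentation changes nothing that matters):

```python
# Zero-padding a tall A and a short, fat B* to square matrices.
# Checks: AB*, AA*, and the nonzero singular values of A are unchanged.
import numpy as np

rng = np.random.default_rng(1)
m, k = 5, 3                                   # m > k: A is tall and skinny
A = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
Bstar = rng.standard_normal((k, m)) + 1j * rng.standard_normal((k, m))

A_pad = np.hstack([A, np.zeros((m, m - k))])            # append zero columns on the right
Bstar_pad = np.vstack([Bstar, np.zeros((m - k, m))])    # append zero rows at the bottom

print(np.allclose(A @ Bstar, A_pad @ Bstar_pad))                  # product AB* unchanged
print(np.allclose(A @ A.conj().T, A_pad @ A_pad.conj().T))        # AA* unchanged
sv = np.linalg.svd(A, compute_uv=False)
sv_pad = np.linalg.svd(A_pad, compute_uv=False)
print(np.allclose(sv, sv_pad[:k]), np.allclose(sv_pad[k:], 0))    # only extra zeros appear
```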
Main argument
With singular values always written in the usual decreasing order $\sigma_1\geq \sigma_2 \geq \cdots \geq \sigma_m$, and $\Sigma_M$ denoting the diagonal matrix of singular values of $M$.
The short answer comes from two majorization facts: the absolute values of the eigenvalues of $AB^*$ are weakly majorized by its singular values (Weyl's majorant theorem), and
$\Sigma_{AB^*} \preceq_w \Sigma_{A} \Sigma_{B^*} $
i.e. the singular values of $AB^*$ are weakly majorized by the diagonal entries of $\Sigma_{A} \Sigma_{B^*}$ (a result of Horn on the singular values of a product).
(In both cases we may in fact speak of log-majorization.)
Combine this with the fact that $u \mapsto u^n$ is convex and increasing for $u \geq 0$ (so weak majorization is preserved when we take $n$-th powers) and we get the result. In particular:
$\big \vert \text{trace}\Big(\big(AB^*\big)^n\Big)\big \vert$
$=\big \vert\sum_{k=1}^m \big(\lambda_k^{(AB^*)}\big)^n\big \vert$
$\leq \sum_{k=1}^m \big \vert\lambda_k^{(AB^*)}\big \vert^n$
$\leq \sum_{k=1}^m \big(\sigma_k^{(AB^*)}\big)^n$
$= \text{trace}\Big(\big(\Sigma_{AB^*}\big)^n\Big)$
$\leq \text{trace}\Big(\big(\Sigma_{A}\Sigma_{B^*}\big)^n\Big)$
$= \text{trace}\Big((\Sigma_{A})^n(\Sigma_{B^*})^n\Big)$
$= \text{trace}\Big(\big((\Sigma_{A})^{2n}\big)^\frac{1}{2}\big((\Sigma_{B^*})^{2n}\big)^\frac{1}{2}\Big)$
$\leq \text{trace}\Big(\frac{1}{2}\big(\Sigma_{A}\big)^{2n}+ \frac{1}{2}\big(\Sigma_{B^*}\big)^{2n}\Big)$
$= \frac{1}{2}\text{trace}\Big(\big(\Sigma_{A}\big)^{2n}\Big)+ \frac{1}{2}\text{trace}\Big(\big(\Sigma_{B^*}\big)^{2n}\Big)$
$= \frac{1}{2}\text{trace}\Big(\big(AA^*\big)^n\Big) + \frac{1}{2}\text{trace}\Big(\big(BB^*\big)^{n}\Big)$
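For what it's worth, every link of the chain can be checked numerically. The sketch below uses numpy with random square complex matrices (i.e. after the WLOG padding) and the arbitrary choice $n=3$; note that the last inequality in the chain is just the termwise AM-GM bound $\sigma^n\tau^n \le \tfrac12(\sigma^{2n}+\tau^{2n})$ applied down the diagonal.

```python
# Numerical walk through the chain above.
# Illustrative setup: random square complex matrices (post-padding), power n = 3.
import numpy as np

rng = np.random.default_rng(2)
m, n = 4, 3
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
B = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
Bstar = B.conj().T

eig = np.linalg.eigvals(A @ Bstar)                    # eigenvalues of AB*
sig_AB = np.linalg.svd(A @ Bstar, compute_uv=False)   # singular values of AB*
sig_A = np.linalg.svd(A, compute_uv=False)            # returned in decreasing order
sig_B = np.linalg.svd(Bstar, compute_uv=False)

lhs   = abs(np.trace(np.linalg.matrix_power(A @ Bstar, n)))
step1 = np.sum(np.abs(eig) ** n)          # |sum of lambda^n| <= sum of |lambda|^n
step2 = np.sum(sig_AB ** n)               # Weyl: |lambda| weakly majorized by sigma
step3 = np.sum((sig_A * sig_B) ** n)      # Horn: sigma(AB*) weakly majorized by sigma(A)sigma(B*)
rhs   = 0.5 * np.sum(sig_A ** (2 * n)) + 0.5 * np.sum(sig_B ** (2 * n))  # termwise AM-GM

print(lhs <= step1 <= step2 <= step3 <= rhs)          # expect True

# The final quantity is exactly (1/2) tr[(AA*)^n] + (1/2) tr[(BB*)^n]:
rhs_trace = np.real(0.5 * np.trace(np.linalg.matrix_power(A @ A.conj().T, n))
                    + 0.5 * np.trace(np.linalg.matrix_power(B @ Bstar, n)))
print(np.isclose(rhs, rhs_trace))                     # expect True
```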