Currently I am solving an optimization problem that could be written as follows:
$$\min J= \sum_{i=1}^N {(q_i^H\Lambda q_i)}^{\frac{1}{3}} $$
subject to $\{q_i\}_{i\in [1..N]}$ forming an orthonormal basis.
$\Lambda$ is a symmetric $N\times N$ matrix with eigenvectors $u_i$ and eigenvalues $\lambda_i$.
The $q_i$ are column vectors with size $N\times 1$ that collectively form an orthonormal basis.
Do you know of any method to solve this problem? What we want to find are expressions for the $q_i$.
We are asked to minimize the objective function $J$ by selecting an orthonormal basis $\{q_1,\ldots ,q_n\}$, where:
$$ J = \sum_{i=1}^n (q_i^H \Lambda q_i)^{\frac{1}{3}} $$
Since $\Lambda$ is (real?) symmetric (Hermitian?), $\Lambda$ can be assumed diagonal without loss of generality, by conflating the required orthogonal (unitary?) similarity transformation with our choice of basis $\{q_i\}$.
The odd-root exponent $\frac{1}{3}$ makes it possible to treat (real) eigenvalues $\lambda_i$ of either sign as well as zero.
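This reduction is easy to check numerically. The sketch below (with ad-hoc names `J_basis` and `rng`, and using `np.cbrt` as the real cube root for the odd-root exponent) verifies that evaluating $J$ against $\Lambda = U D U^T$ on a basis $Q$ gives the same value as evaluating against the diagonal $D$ on the rotated basis $U^T Q$:

```python
import numpy as np

def J_basis(Q, Lam):
    """J = sum_i cbrt(q_i^T Lam q_i), columns of Q as the basis."""
    return np.cbrt(np.diag(Q.T @ Lam @ Q)).sum()

rng = np.random.default_rng(0)
n = 5
S = rng.standard_normal((n, n))
Lam = (S + S.T) / 2                    # real symmetric; eigenvalues of both signs
w, U = np.linalg.eigh(Lam)             # Lam = U @ diag(w) @ U.T

Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthonormal basis

# q_i^T Lam q_i = (U^T q_i)^T diag(w) (U^T q_i), so the two values agree
assert np.isclose(J_basis(Q, Lam), J_basis(U.T @ Q, np.diag(w)))
```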
Outside of perhaps $n=2$, I don't see much prospect of "expressions for the $q_i$" that explicitly minimize $J$. The case $n=2$ is illuminating for suggesting methods for numerical optimization.
First note that for $\Lambda = \left( \begin{array} {cc} A & 0 \\ 0 & B \end{array} \right)$ the function $J$ is:
$$ \left[ \left( \begin{array} {cc} \cos \theta & \sin \theta \end{array} \right) \Lambda \left( \begin{array} {c} \cos \theta \\ \sin \theta \end{array} \right) \right]^{\frac{1}{3}} + \left[ \left( \begin{array} {cc} -\sin \theta & \cos \theta \end{array} \right) \Lambda \left( \begin{array} {c} -\sin \theta \\ \cos \theta \end{array} \right) \right]^{\frac{1}{3}} = (A\cos^2 \theta + B\sin^2 \theta)^{\frac{1}{3}} + (A\sin^2 \theta + B\cos^2 \theta)^{\frac{1}{3}} $$
Trivially, if $A=B$ then identically $J = 2A^{\frac{1}{3}}$, and if $A=-B$ then identically $J = 0$. More generally, assuming (say) $A \gt B$:
$$ J = ((A-B)\cos^2 \theta + B)^{\frac{1}{3}} + (A + (B-A)\cos^2 \theta)^{\frac{1}{3}} $$
and we can substitute $x = (A-B)\cos^2 \theta \in [0,A-B]$:
$$ J = (x + B)^{\frac{1}{3}} + (A - x)^{\frac{1}{3}} $$
Applying standard univariate calculus: the value of $J$ is the same at both endpoints, namely $A^{\frac{1}{3}} + B^{\frac{1}{3}}$. Taking the derivative:
$$ \frac{dJ}{dx} = \frac{1}{3} (x+B)^{\frac{-2}{3}} - \frac{1}{3} (A-x)^{\frac{-2}{3}} $$
we find a critical point where $|x+B|=|A-x|$, i.e. midpoint $x = \frac{A-B}{2}$. There:
$$ J\bigg\rvert_{x=\frac{A-B}{2}} = 2\left(\frac{A+B}{2}\right)^{\frac{1}{3}} $$
We can determine by inspection whether this critical point is a relative minimum or a relative maximum. Consider the sign of the derivative of $J$ at $x=0$: if it is negative, $J$ decreases toward the midpoint and the critical point is our minimum; otherwise $J$ increases toward the midpoint and the minimum is the endpoints' value. Moreover:
$$ \frac{dJ}{dx}\bigg\rvert_{x=0} = \frac{1}{3}\left(B^{\frac{-2}{3}} - A^{\frac{-2}{3}}\right) $$
so the function $J$ is decreasing at $x=0$ if and only if $|A| \lt |B|$.
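A brute-force check of this $n=2$ analysis is straightforward. In the sketch below, `J_2x2` and `predicted_min` are hypothetical helper names (`np.cbrt` again supplies the real cube root), and the predicted minimum — the smaller of the endpoint value and the midpoint value — is compared against a fine grid over $\theta$:

```python
import numpy as np

def J_2x2(A, B, theta):
    """J for diagonal Lambda = diag(A, B) with the basis rotated by theta."""
    c2, s2 = np.cos(theta)**2, np.sin(theta)**2
    # np.cbrt is the real cube root, so negative eigenvalues are handled
    return np.cbrt(A*c2 + B*s2) + np.cbrt(A*s2 + B*c2)

def predicted_min(A, B):
    """For A >= B: the smaller of the endpoint value A^(1/3) + B^(1/3)
    and the midpoint (critical-point) value 2*((A+B)/2)^(1/3)."""
    return min(np.cbrt(A) + np.cbrt(B), 2*np.cbrt((A + B)/2))

thetas = np.linspace(0.0, np.pi/2, 100001)
for A, B in [(5.0, 1.0), (5.0, -1.0), (1.0, -4.0), (2.0, 2.0)]:
    brute = J_2x2(A, B, thetas).min()
    assert abs(brute - predicted_min(A, B)) < 1e-6
```

Note that for $(A,B)=(5,1)$ and $(5,-1)$ we have $|A| \gt |B|$ and the endpoints win, while for $(1,-4)$ we have $|A| \lt |B|$ and the midpoint wins, matching the sign criterion above.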
For $n \gt 2$ this motivates a greedy numerical method: consider all pairs of orthonormal basis vectors, find the pair $q_i$, $q_j$ whose mutual adjustment gives the greatest decrease in $J$, apply that adjustment, and repeat. Stop when no pair adjustment gives a decrease.
Concretely, suppose the current orthonormal basis $\{q_i\}$ is represented by the columns of an orthogonal (or unitary) matrix $Q$. An adjustment confined to two columns at a time is made by multiplying on the right by a Givens rotation $R$, i.e. forming $QR$.
Note that restricting to rotations (excluding reflections among the orthogonal matrices), if we wish, loses nothing, because replacing a basis vector $q_i$ by $-q_i$ does not affect the value of $J$.
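The greedy pairwise scheme might be sketched as follows. This is illustrative, not definitive: `greedy_minimize`, the $\theta$ grid size, and the stopping tolerance are ad-hoc choices, and `np.cbrt` is the real cube root. The key observation is that rotating columns $i,j$ by $\theta$ changes only two terms of $J$, determined by the scalars $a = q_i^T\Lambda q_i$, $b = q_j^T\Lambda q_j$, $g = q_i^T\Lambda q_j$, so each pair reduces to a cheap one-dimensional search:

```python
import numpy as np

def J(Q, Lam):
    """Objective: sum_i cbrt(q_i^T Lam q_i), columns of Q as the basis."""
    return np.cbrt(np.einsum('ij,jk,ki->i', Q.T, Lam, Q)).sum()

def pair_objective(a, b, g, theta):
    """Sum of the two terms affected by a Givens rotation of columns i, j,
    where a = q_i^T L q_i, b = q_j^T L q_j, g = q_i^T L q_j."""
    c, s = np.cos(theta), np.sin(theta)
    return (np.cbrt(c*c*a + s*s*b + 2*c*s*g)
            + np.cbrt(s*s*a + c*c*b - 2*c*s*g))

def greedy_minimize(Lam, n_grid=721, tol=1e-10, max_sweeps=200):
    """Repeatedly apply the single pairwise Givens rotation (over all pairs
    and a theta grid) that most decreases J; stop when none helps."""
    n = Lam.shape[0]
    Q = np.eye(n)
    thetas = np.linspace(0.0, np.pi, n_grid, endpoint=False)
    for _ in range(max_sweeps):
        M = Q.T @ Lam @ Q                      # Lambda in the current basis
        best_dec, best_move = 0.0, None
        for i in range(n):
            for j in range(i + 1, n):
                a, b, g = M[i, i], M[j, j], M[i, j]
                vals = pair_objective(a, b, g, thetas)
                k = vals.argmin()
                dec = (np.cbrt(a) + np.cbrt(b)) - vals[k]
                if dec > best_dec:
                    best_dec, best_move = dec, (i, j, thetas[k])
        if best_move is None or best_dec < tol:
            break
        i, j, th = best_move
        c, s = np.cos(th), np.sin(th)
        G = np.eye(n)                          # Givens rotation in plane (i, j)
        G[i, i] = G[j, j] = c
        G[i, j], G[j, i] = -s, s
        Q = Q @ G
    return Q
```

Being greedy, this finds a local minimum in general; on a $2\times 2$ diagonal $\Lambda$ it reproduces (up to grid resolution) the endpoint-versus-midpoint answer derived above.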