I have matrices of the form:
\begin{equation} H_{BdG} = \begin{pmatrix}H&\Delta\\\Delta^{\dagger}&-H^T\end{pmatrix}, \end{equation}
where $H = H^{\dagger}$ and $\Delta^{\dagger} = \Delta^T = -\Delta$. The spectrum of $H_{BdG}$ is symmetric around zero: for every eigenvalue $E$ there is a corresponding $-E$.
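For concreteness, here is a minimal numpy sketch of the setup (the size `n = 4` and the random construction are arbitrary choices for illustration): it builds an $H_{BdG}$ with the stated symmetries and checks that the spectrum is symmetric about zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # arbitrary small size for the check

# H Hermitian
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2

# Delta antisymmetric, so Delta^T = -Delta and Delta^dagger = -Delta^*
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Delta = (B - B.T) / 2

H_BdG = np.block([[H, Delta], [Delta.conj().T, -H.T]])

# eigvalsh returns the eigenvalues sorted in ascending order;
# particle-hole symmetry makes the sorted spectrum equal to its own
# negation reversed
E = np.linalg.eigvalsh(H_BdG)
assert np.allclose(E, -E[::-1])
```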
The quantity I am interested in is the sum of the positive eigenvalues of $H_{BdG}$.
I want to compute this value numerically for very large matrices, and direct diagonalization is too slow for my purposes. I was hoping there could be a shortcut, since I only care about the sum of the eigenvalues, not the individual ones.
I thought a possible solution (if this is even possible) could be using the trace of the matrix. The trace of $H_{BdG}$ is zero, but I was hoping there could be a smart transformation to eliminate the negative eigenvalues.
I noticed the quantity I want is equivalent to half the sum of the absolute value of all eigenvalues:
\begin{equation} \sum_{i,\, E_i >0} E_i = \frac{1}{2}\sum_i \vert E_i\vert, \end{equation}
so maybe that's a start? I also noticed that if I wanted the sum of the squares of the positive eigenvalues, the solution would be simple: I would only need half the trace of $H_{BdG}^2$, since
\begin{equation} \sum_{i,\, E_i >0} E_i^2 = \frac{1}{2}\sum_i E_i^2 = \frac{1}{2}\operatorname{Tr} H_{BdG}^2. \end{equation}
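Both observations are easy to verify numerically. The sketch below (same random construction as one might use for any BdG matrix; the size is arbitrary) checks that the positive-eigenvalue sum equals half the sum of absolute values, and that half the trace of $H_{BdG}^2$ equals the sum of squared positive eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4  # arbitrary size for the check

# H Hermitian, Delta antisymmetric
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Delta = (B - B.T) / 2

H_BdG = np.block([[H, Delta], [Delta.conj().T, -H.T]])
E = np.linalg.eigvalsh(H_BdG)

# sum over positive eigenvalues = half the sum of absolute values
pos_sum = E[E > 0].sum()
assert np.isclose(pos_sum, 0.5 * np.abs(E).sum())

# sum of squared positive eigenvalues = (1/2) Tr(H_BdG^2)
pos_sq_sum = (E[E > 0] ** 2).sum()
assert np.isclose(pos_sq_sum, 0.5 * np.trace(H_BdG @ H_BdG).real)
```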
I don't know if this information helps, but the reason I want this quantity is that I ultimately need its derivative with respect to a parameter of the matrix.
So, is this workaround even possible? If so, how?