Let $A \in \mathbb{R}^{n \times n}$. It is well-known that $\text{tr}(A)$ is equal to the sum of the eigenvalues of $A$.
Let us now restrict $A$ to being positive semi-definite. Obviously, it is still the case that $\text{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$, where the $\lambda_i$ are the eigenvalues of $A$, since that was a general result. However, is there an easier way to show it if we restrict $A$ to being positive semi-definite? I can't find a more elegant proof than the general one that applies to any square matrix.
I'm not sure which general proof you have in mind, but if we choose any orthonormal basis $v_1, \ldots, v_n$ for $\mathbb{R}^n$ (with respect to the standard inner product $\left< \cdot, \cdot \right>$) then
$$ \mathrm{tr}(A) = \sum_{i=1}^n \left< Av_i, v_i \right>. $$
(This identity follows from the cyclic property of the trace: if $V$ is the orthogonal matrix whose columns are $v_1, \ldots, v_n$, then $\sum_{i=1}^n \left< Av_i, v_i \right> = \mathrm{tr}(V^\top A V) = \mathrm{tr}(A V V^\top) = \mathrm{tr}(A)$.)
If $A$ is symmetric, then by choosing $v_1, \ldots, v_n$ to be an orthonormal basis of eigenvectors of $A$ (with $Av_i = \lambda_i v_i$), you immediately get
$$ \mathrm{tr}(A) = \sum_{i=1}^n \left< Av_i, v_i \right> = \sum_{i=1}^n \left< \lambda_i v_i, v_i \right> = \sum_{i=1}^n \lambda_i. $$
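Not part of the argument, but here is a quick numerical sanity check of the two identities above, using NumPy; the matrix `A = B @ B.T` and the seed are arbitrary choices just to produce a symmetric positive semi-definite example.

```python
import numpy as np

# Build an arbitrary symmetric positive semi-definite matrix: B B^T is always PSD.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T

# eigh returns eigenvalues and an orthonormal basis of eigenvectors (columns of V).
eigvals, V = np.linalg.eigh(A)

trace_A = np.trace(A)
sum_eigs = eigvals.sum()
# tr(A) = sum_i <A v_i, v_i> over the orthonormal eigenbasis.
sum_quad = sum(V[:, i] @ A @ V[:, i] for i in range(4))

assert np.isclose(trace_A, sum_eigs)
assert np.isclose(trace_A, sum_quad)
```

The same check passes for any orthonormal basis (e.g. the columns of a random orthogonal matrix), since the first identity does not require $v_i$ to be eigenvectors.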