Let $A$ be a fixed $n\times n$ matrix, and let $B_k$ be a sequence of $n\times n$ matrices such that $B_k$ converges to the identity matrix $I$. Suppose a rate of convergence has been established under the spectral norm (or any other matrix norm that is convenient).
Question: how is the rate of convergence related to the rate of convergence of $\text{tr}(AB_k)\to \text{tr}(A)$?
Let $A = U \Sigma V^*$ be a singular value decomposition of $A$. Then, using the cyclic property of the trace, $$|\text{tr}(A (B_k - I))| = |\text{tr}(\Sigma V^* (B_k - I) U) | \le \text{tr}(\Sigma) \|B_k - I\|,$$ where $\|\cdot\|$ is the spectral norm. The inequality follows because, writing $M = V^*(B_k - I)U$, we have $|\text{tr}(\Sigma M)| = \bigl|\sum_i \sigma_i M_{ii}\bigr| \le \text{tr}(\Sigma)\max_i |M_{ii}| \le \text{tr}(\Sigma)\,\|M\|$, and $\|M\| = \|B_k - I\|$ since $U$ and $V$ are unitary,
where $\text{tr}(\Sigma)$ is the sum of the singular values of $A$, i.e. its nuclear (trace) norm. Moreover, this constant is best possible: the bound holds with equality whenever $V^*(B_k-I)U$ is a scalar multiple of the identity. So $\text{tr}(AB_k)\to\text{tr}(A)$ at least as fast as $B_k \to I$, with the constant $\text{tr}(\Sigma)$.
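A quick numerical sketch of both claims (the matrices and the $1/k$ rate are arbitrary choices for illustration, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
I = np.eye(n)
nuclear = np.linalg.norm(A, ord='nuc')  # tr(Sigma): sum of singular values

# The bound |tr(A(B_k - I))| <= tr(Sigma) * ||B_k - I||_2 for a sequence
# B_k -> I at an (arbitrary) rate 1/k:
for k in range(1, 20):
    B_k = I + rng.standard_normal((n, n)) / k
    lhs = abs(np.trace(A @ (B_k - I)))
    rhs = nuclear * np.linalg.norm(B_k - I, ord=2)
    assert lhs <= rhs + 1e-12

# Tightness: choosing B - I = c * V U^* makes V^*(B - I)U = c I,
# so the bound is attained with equality.
U, s, Vh = np.linalg.svd(A)
c = 0.1
B = I + c * Vh.conj().T @ U.conj().T
lhs = abs(np.trace(A @ (B - I)))
rhs = nuclear * np.linalg.norm(B - I, ord=2)
assert abs(lhs - rhs) < 1e-10
```

In the equality case, $\text{tr}(A(B-I)) = \text{tr}(U\Sigma V^* \cdot cVU^*) = c\,\text{tr}(\Sigma)$ while $\|B-I\| = c$, since $VU^*$ is unitary.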