The Kullback–Leibler divergence between two $n$-variate normal distributions, say $ F_{1} = N_{n} (0, C_{1}+\sigma^{2}I)$ and $ F_{2} = N_{n} (0, C_{2}+\sigma^{2}I)$, is $$ KL(F_{2}: F_{1}) = \frac{1}{2}\left[\operatorname{tr}(C_{2}^{-1}C_{1}) - n - \log\left(\det(C_{2}^{-1}C_{1})\right)\right],$$ where, for brevity, I write $C_{i}$ for the full covariance $C_{i}+\sigma^{2}I$. Suppose that the following conditions are met:
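As a sanity check on the closed form, here is a minimal numpy sketch (function name and test matrices are mine, not from the problem) that evaluates the displayed expression and confirms it vanishes when the two covariances coincide:

```python
import numpy as np

def kl_zero_mean_gauss(S1, S2):
    """KL divergence KL(N(0, S1) || N(0, S2)) for SPD covariances S1, S2.

    Closed form: 0.5 * [tr(S2^{-1} S1) - n - log det(S2^{-1} S1)].
    """
    n = S1.shape[0]
    M = np.linalg.solve(S2, S1)       # S2^{-1} S1 without forming an explicit inverse
    sign, logdet = np.linalg.slogdet(M)
    return 0.5 * (np.trace(M) - n - logdet)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
sigma2 = 0.5
C = A @ A.T + sigma2 * np.eye(4)      # plays the role of C_1 + sigma^2 I
print(kl_zero_mean_gauss(C, C))       # ≈ 0: divergence vanishes for equal covariances
```

The divergence is nonnegative for any pair of SPD covariances and zero exactly when they agree, which is a quick way to catch sign or transposition mistakes in the formula.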
$C_{1}$ is a full-rank matrix of dimension $n \times n$ and can be decomposed as $UDU'$, where $U$ is an $n \times n$ orthogonal matrix and $D$ is a diagonal matrix.
$C_{2}$ is a best rank-$m$ approximation of $C_{1}$ (so still of dimension $n \times n$) and can be decomposed as $EDE'$, where $E$ is an $n \times m$ matrix with orthonormal columns and $D$ is the corresponding $m \times m$ diagonal matrix.
I know that $ C_{1} - C_{2} = U_{(n-m)}L U'_{(n-m)} $, and given $ \| C_{1} - C_{2}\|_F< \epsilon $ I need to find an upper bound for $KL(F_{2}: F_{1})$.
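This residual identity can be checked numerically. The sketch below (my own construction, assuming $C_2$ is the eigenvalue truncation of $C_1$ keeping the $m$ leading eigenpairs, which is the Eckart–Young best rank-$m$ approximation) verifies that $C_1 - C_2 = U_{(n-m)}L U'_{(n-m)}$ and that $\|C_1 - C_2\|_F$ is determined by the discarded eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 4
A = rng.standard_normal((n, n))
C1 = A @ A.T                               # symmetric positive definite, plays C_1
w, U = np.linalg.eigh(C1)                  # C_1 = U D U', eigenvalues ascending
idx = np.argsort(w)[::-1]                  # reorder eigenvalues descending
w, U = w[idx], U[:, idx]

E, Dm = U[:, :m], np.diag(w[:m])           # leading m eigenpairs -> C_2 = E Dm E'
C2 = E @ Dm @ E.T

Unm, L = U[:, m:], np.diag(w[m:])          # discarded eigenpairs
resid = Unm @ L @ Unm.T                    # U_{(n-m)} L U'_{(n-m)}

print(np.allclose(C1 - C2, resid))         # the stated residual identity
print(np.isclose(np.linalg.norm(C1 - C2, 'fro'),
                 np.sqrt(np.sum(w[m:] ** 2))))   # ||C1 - C2||_F from dropped eigenvalues
```

In particular, $\|C_1 - C_2\|_F^2 = \sum_j l_j^2$, so the hypothesis $\|C_1-C_2\|_F < \epsilon$ is a bound on the discarded eigenvalues $l_j$.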
This is how I proceeded before getting stuck; I would appreciate a direction to continue. First consider $ \operatorname{tr}(C_{2}^{-1}C_{1}) - n $: \begin{align*} \operatorname{tr}(C_{2}^{-1}C_{1}) - n &= \operatorname{tr}(C_{2}^{-1}(C_{1}-C_{2}))\\ &= \operatorname{tr}(ED^{-1}E'\, U_{(n-m)}L U'_{(n-m)})\quad \text{using the orthogonality of $E$ and $U$}\\ &= \operatorname{tr}(D^{-1}L) \\ &= \sum_{i=1}^{m}\sum_{j=1}^{n} d_i l_j, \end{align*} and hence \begin{align*} \operatorname{tr}(C_{2}^{-1}C_{1}) - n &= \operatorname{tr}(C_{2}^{-1}(C_{1}-C_{2}))\\ &\leq \|C_2^{-1}\|_{\max} \sum_{i=1}^{m}\sum_{j=1}^{n} d_i l_j \\ &\leq \|C_2^{-1}\|_{\max}\, n^{2}\epsilon\\ &\leq \left(\frac{n}{\sigma}\right)^2 \epsilon. \end{align*}
The problem I have is this: are the third and fourth lines of the derivation valid? Once I know this is correct, I will proceed with the $\log(\det(C_{2}^{-1}C_{1}))$ term.
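One way to probe the questioned steps is numerically. The sketch below (my own setup, assuming $C_2^{-1}$ denotes the pseudoinverse $E D^{-1} E'$ of the rank-$m$ truncation) checks two ingredients the derivation relies on: cyclicity of the trace, and the orthogonality relation $E' U_{(n-m)} = 0$ between the kept and discarded eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 4
A = rng.standard_normal((n, n))
C1 = A @ A.T
w, U = np.linalg.eigh(C1)                 # eigh returns ascending eigenvalues
w, U = w[::-1], U[:, ::-1]                # reorder descending

E, Dm = U[:, :m], np.diag(w[:m])
C2inv = E @ np.linalg.inv(Dm) @ E.T       # pseudoinverse of the rank-m truncation
Unm, L = U[:, m:], np.diag(w[m:])

# cyclicity of the trace, used implicitly when rearranging tr(C2^{-1}(C1 - C2))
X = C2inv @ (Unm @ L @ Unm.T)
print(np.isclose(np.trace(X), np.trace((Unm @ L @ Unm.T) @ C2inv)))

# orthogonality between kept and discarded eigenvectors: E' U_{(n-m)} = 0
print(np.allclose(E.T @ Unm, 0))
```

Evaluating `np.trace(X)` in such an experiment shows directly what $\operatorname{tr}(C_2^{-1}(C_1-C_2))$ equals under this reading of $C_2^{-1}$, which is a concrete way to test whether the third and fourth lines can hold.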