This question arose while following the proof in the appendix of Louis Liporace's paper on maximum-likelihood estimation; the paper concerns classes of probabilistic functions (elliptically symmetric) of Markov chains for which known parameter re-estimation formulas exist.
In order to show that any critical point of a certain auxiliary function is a local maximum, the second derivative is calculated. There are two terms to differentiate for which the procedure is not immediately apparent to me. They are as follows.
$\frac{1}{2}\log|\bar{C}|-\frac{1}{2v_t^2}(O_t-\bar{m})^T\bar{C}(O_t-\bar{m})$
Liporace uses convex combinations, where $0<\theta<1$, with $\bar{C}=\theta C^1+(1-\theta)C^2$ and $\bar{m}=\theta m^1+(1-\theta)C^2$ to make transformations.
After differentiating twice, this should lead to the corresponding terms
$\frac{1}{2}\sum_{i=1}^d\frac{(x_i^1-x_i^2)^2}{(\theta x_i^1+(1-\theta)x_i^2)^2} +\frac{2}{v_t^2}(m^1-m^2)^T(C^1-C^2)\cdot[O_t-(\theta m^1+(1-\theta)m^2)]+v_t^{-2}(m^1-m^2)^T(\theta C^1+(1-\theta)C^2)(m^1-m^2),$
where $x_i^1$ and $x_i^2$ are the diagonal entries of $UC^1U^T$ and $UC^2U^T$ respectively and $U$ is the orthogonal matrix diagonalising $\theta C^1 +(1-\theta)C^2$.
For the first term, I can certainly see that a reasonable first step is to use the transformation to diagonalise $\bar{C}$ as $|\bar{C}|=|U\bar{C}U^T|$. I can derive something very similar to the first term and am sure that we are using the fact that the determinant of a diagonal matrix equals the product of its diagonal entries, but this still does not seem to quite lead to the right answer. I am thrown a little as to why a sum, rather than a product, occurs in the term.
Any specific help with how to proceed with the first or the second term or general hints as to how to handle a differential involving determinants would be much appreciated.
The first term is a sum instead of a product because you can use $ \log(ab) = \log(a) + \log(b) $ to write $ \log \det \bar{C} = \sum_i \; \log \left(\theta x_i^1 + (1-\theta) x_i^2 \right) $. On the other hand, it is not at all clear to me why $ U $ should simultaneously diagonalize both $ C^1 $ and $ C^2 $.
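For what it's worth, this identity is easy to sanity-check numerically. The sketch below (all variable names are my own illustrative stand-ins, not Liporace's) builds random symmetric positive definite $C^1$, $C^2$, diagonalises $\bar{C}$, and compares $\log\det\bar{C}$ with $\sum_i \log(\theta x_i^1 + (1-\theta)x_i^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
d, theta = 4, 0.3

# Random symmetric positive definite stand-ins for C^1 and C^2.
B1, B2 = rng.standard_normal((d, d)), rng.standard_normal((d, d))
C1 = B1 @ B1.T + d * np.eye(d)
C2 = B2 @ B2.T + d * np.eye(d)

Cbar = theta * C1 + (1 - theta) * C2

# U is the orthogonal matrix with U Cbar U^T diagonal.
eigvals, V = np.linalg.eigh(Cbar)   # Cbar = V diag(eigvals) V^T
U = V.T

# x_i^1, x_i^2: diagonal entries of U C1 U^T and U C2 U^T.
x1 = np.diag(U @ C1 @ U.T)
x2 = np.diag(U @ C2 @ U.T)

lhs = np.log(np.linalg.det(Cbar))
rhs = np.sum(np.log(theta * x1 + (1 - theta) * x2))
print(np.isclose(lhs, rhs))  # the two expressions agree
```

Note that $UC^1U^T$ and $UC^2U^T$ are generally *not* diagonal here; only their diagonal entries enter, because the diagonal of $U\bar{C}U^T$ is the $\theta$-combination of those diagonals, and the eigenvalues of $\bar{C}$ are exactly that diagonal.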
I think you have also made a typo, and that $ \bar{m} = \theta m^1 + (1- \theta) m^2 $. For the second term, it is straightforward to work out the product rule as follows. Writing $ A = O_t-\bar{m} $ and $ \dot{M} = \frac{d}{d\theta} M $ for short, then
$$ \frac{d^2}{d\theta^2}(A^T C A) = \frac{d}{d\theta} \; \left(\dot{A}^T C A + A^T \dot{ C} A + A^T C \dot{A} \right) $$
Now you can use the fact that $ A $ and $ C $ are affine in $ \theta $, so the second derivative $ \frac{d^2}{d\theta^2} $ of either of them is zero, and therefore $$ \frac{d}{d\theta} \; \left(\dot{A}^T C A + A^T \dot{ C} A + A^T C \dot{A} \right) = (\dot{A}^T \dot{C} A + \dot{A}^T C \dot{A}) + (\dot{A}^T \dot{C} A + A^T \dot{C} \dot {A}) + (\dot{A}^T C \dot{A} + A^T \dot{C} \dot{A}) \\ = 4 ( \dot{A}^T \dot{C} A ) + 2 \dot{A}^T C \dot{A} $$
Since $ C^1 $ and $ C^2 $ are symmetric, so is $ \dot{C} $, and the scalar $ \dot{A}^T \dot{C} A $ equals its own transpose $ A^T \dot{C} \dot{A} $. This gives the stated answer.
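As a sanity check on this product-rule computation, one can compare the closed form $4\dot{A}^T\dot{C}A + 2\dot{A}^TC\dot{A}$ against a finite-difference second derivative of $A^T\bar{C}A$. The sketch below uses random data; all names are illustrative assumptions, not quantities from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3

# Random symmetric positive definite C1, C2 and random vectors m1, m2, Ot.
B1, B2 = rng.standard_normal((d, d)), rng.standard_normal((d, d))
C1 = B1 @ B1.T + d * np.eye(d)
C2 = B2 @ B2.T + d * np.eye(d)
m1, m2, Ot = rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal(d)

def f(theta):
    """The quadratic form A^T Cbar A, with A and Cbar affine in theta."""
    Cbar = theta * C1 + (1 - theta) * C2
    A = Ot - (theta * m1 + (1 - theta) * m2)
    return A @ Cbar @ A

theta, h = 0.4, 1e-4
# Central finite difference for the second derivative (exact up to rounding,
# since f is a cubic polynomial in theta).
numeric = (f(theta + h) - 2 * f(theta) + f(theta - h)) / h**2

# Closed form from the product rule: 4 Adot^T Cdot A + 2 Adot^T Cbar Adot,
# with Adot = -(m1 - m2) and Cdot = C1 - C2.
Adot, Cdot = -(m1 - m2), C1 - C2
Cbar = theta * C1 + (1 - theta) * C2
A = Ot - (theta * m1 + (1 - theta) * m2)
closed = 4 * Adot @ Cdot @ A + 2 * Adot @ Cbar @ Adot

print(np.isclose(numeric, closed, rtol=1e-4, atol=1e-3))
```

With $\dot{A} = -(m^1 - m^2)$ and $\dot{C} = C^1 - C^2$ substituted in, the closed form matches the second and third terms quoted in the question, up to the overall $-\frac{1}{2v_t^2}$ factor in front of the quadratic form.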