I recently learned about the multivariate Gaussian distribution, and I saw a formula derivation in the literature where I do not know how to simplify the log-likelihood from
$$-\frac{K}{2}\log |\Sigma|-\frac{1}{2}\sum\limits_{i=1}^K (y_i-\mu)^T\Sigma^{-1}(y_i-\mu)$$
to
$$-\frac{K}{2}\left(\log |\Sigma| + \text{tr}(\Sigma^{-1}\bar{\Sigma}) + (\mu - \bar{\mu})^T\Sigma^{-1}(\mu - \bar{\mu})\right),$$
where $\bar{\mu}=\frac{1}{K}\sum\limits_{i=1}^K y_i$ and $\bar{\Sigma}=\frac{1}{K}\sum\limits_{i=1}^K (y_i - \bar{\mu})(y_i - \bar{\mu})^T$.

This follows analogously from the one-dimensional case where we simply add zero to the summand:
$$\sum\limits_{i=1}^K (x_i-\mu)^2=\sum\limits_{i=1}^K (x_i-\bar{x}+\bar{x}-\mu)^2=\sum\limits_{i=1}^K (x_i-\bar{x})^2+K(\bar{x} - \mu)^2$$
as $2\sum (x_i-\bar{x})(\bar{x}-\mu)$ vanishes because $\sum (x_i-\bar{x})=0$.
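As a quick numerical sanity check of this one-dimensional identity (using NumPy, with an arbitrary sample and reference point of my choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)   # arbitrary sample x_1, ..., x_K
mu = 1.3                  # arbitrary reference point
K = len(x)
x_bar = x.mean()          # sample mean

lhs = np.sum((x - mu) ** 2)
rhs = np.sum((x - x_bar) ** 2) + K * (x_bar - mu) ** 2
assert np.isclose(lhs, rhs)  # the two sides agree
```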
Extending this logic to the multivariate case, with $\bar{\mu}=K^{-1}\sum\limits_{i=1}^K y_i$ as above, we expand
$$\sum\limits_{i=1}^K \left(y_i-\bar{\mu}+\bar{\mu}-\mu\right)^T\Sigma^{-1}\left(y_i-\bar{\mu}+\bar{\mu}-\mu\right)$$
$$=\sum\limits_{i=1}^K (y_i-\bar{\mu})^T \Sigma^{-1}(y_i-\bar{\mu})+ (y_i-\bar{\mu})^T\Sigma^{-1}(\bar{\mu} - \mu) + (\bar{\mu} - \mu)^T\Sigma^{-1}(y_i-\bar{\mu})+(\bar{\mu} - \mu)^T\Sigma^{-1}(\bar{\mu} - \mu).$$
From here we notice that the factor $(\bar{\mu} - \mu)$ does not depend on the index, so the last term simply appears $K$ times. For the cross terms, since $Ax = (x^TA^T)^T$, we have
$$(y_i-\bar{\mu})^T\Sigma^{-1}(\bar{\mu} - \mu)= \left((\bar{\mu} -\mu)^T\left(\Sigma^{-1}\right)^T(y_i-\bar{\mu})\right)^T$$
so then
$$\sum\limits_{i=1}^K (y_i-\bar{\mu})^T\Sigma^{-1}(\bar{\mu} - \mu)=\sum\limits_{i=1}^K \left((\bar{\mu} -\mu)^T\left(\Sigma^{-1}\right)^T(y_i-\bar{\mu})\right)^T$$
$$=\left(\sum\limits_{i=1}^K (\bar{\mu} -\mu)^T\left(\Sigma^{-1}\right)^T(y_i-\bar{\mu})\right)^T=\left((\bar{\mu} -\mu)^T\left(\Sigma^{-1}\right)^T\sum\limits_{i=1}^K (y_i-\bar{\mu})\right)^T=0^T=0.$$
The other cross-term sum vanishes similarly once we pull the constant factor $(\bar{\mu} - \mu)^T\Sigma^{-1}$ out of the sum, leaving $\sum_{i=1}^K (y_i-\bar{\mu})=0$. Then we have
$$\sum\limits_{i=1}^K (y_i-\mu)^T\Sigma^{-1}(y_i-\mu)=\sum\limits_{i=1}^K (y_i-\bar{\mu})^T\Sigma^{-1}(y_i-\bar{\mu}) + K(\bar{\mu} - \mu)^T\Sigma^{-1}(\bar{\mu} - \mu).$$
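This multivariate decomposition can also be verified numerically (NumPy again; the sample, the mean parameter, and the positive-definite matrix standing in for $\Sigma^{-1}$ are all arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
d, K = 3, 40
y = rng.normal(size=(K, d))        # rows are the samples y_i
mu = rng.normal(size=d)            # arbitrary mean parameter
A = rng.normal(size=(d, d))
Sigma_inv = A @ A.T + np.eye(d)    # symmetric positive definite, plays the role of Sigma^{-1}
y_bar = y.mean(axis=0)             # sample mean

lhs = sum((yi - mu) @ Sigma_inv @ (yi - mu) for yi in y)
rhs = (sum((yi - y_bar) @ Sigma_inv @ (yi - y_bar) for yi in y)
       + K * (y_bar - mu) @ Sigma_inv @ (y_bar - mu))
assert np.isclose(lhs, rhs)  # cross terms really do cancel
```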
We can simplify the first summand to the desired result by noting that
$$\sum\limits_{i=1}^K (y_i-\bar{\mu})^T\Sigma^{-1}(y_i-\bar{\mu})=\sum\limits_{i=1}^K \text{tr}\left((y_i-\bar{\mu})^T\Sigma^{-1}(y_i-\bar{\mu})\right)=\sum\limits_{i=1}^K \text{tr}\left(\Sigma^{-1}(y_i-\bar{\mu})(y_i-\bar{\mu})^T\right)$$
$$=\text{tr}\left(\Sigma^{-1}\sum\limits_{i=1}^K (y_i-\bar{\mu})(y_i-\bar{\mu})^T\right)=\text{tr}\left(\Sigma^{-1}K\bar{\Sigma}\right)=K\,\text{tr}(\Sigma^{-1}\bar{\Sigma}),$$
where the first equality holds because each summand is a scalar (equal to its own trace) and the second uses the cyclic property of the trace,
where the text defined $\bar{\Sigma}=K^{-1}\sum\limits_{i=1}^K (y_i-\bar{\mu})(y_i-\bar{\mu})^T$.
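Finally, the trace identity itself can be checked numerically under the same arbitrary setup as before:

```python
import numpy as np

rng = np.random.default_rng(2)
d, K = 3, 40
y = rng.normal(size=(K, d))        # rows are the samples y_i
A = rng.normal(size=(d, d))
Sigma_inv = A @ A.T + np.eye(d)    # symmetric positive definite, plays the role of Sigma^{-1}
y_bar = y.mean(axis=0)

# Sigma_bar = K^{-1} sum_i (y_i - y_bar)(y_i - y_bar)^T
Sigma_bar = sum(np.outer(yi - y_bar, yi - y_bar) for yi in y) / K

lhs = sum((yi - y_bar) @ Sigma_inv @ (yi - y_bar) for yi in y)
rhs = K * np.trace(Sigma_inv @ Sigma_bar)
assert np.isclose(lhs, rhs)  # sum of quadratic forms equals K * tr(Sigma^{-1} Sigma_bar)
```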