Page 98 of Kevin P. Murphy, *Machine Learning: A Probabilistic Perspective*, gives this equation for the Mahalanobis distance:

$$(\mathbf{x} - \boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu}) = \sum_{i=1}^{D} \frac{y_i^2}{\lambda_i}$$

where $y_i \triangleq u_i^T(\mathbf{x} - \boldsymbol{\mu})$, and $u_i$ is the $i$'th column of $U$, containing the $i$'th eigenvector of the covariance matrix $\Sigma$ (with eigenvalue $\lambda_i$).
I am trying to understand that equation and reproduce the accompanying figure, which is a visualization of a 2-dimensional Gaussian density. The major and minor axes of the ellipse are defined by the first two eigenvectors of the covariance matrix, namely $u_1$ and $u_2$ (based on Figure 2.7 of Bishop 2006a).
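As I understand it, the contour at unit Mahalanobis distance can be parametrized as $\mathbf{x}(\theta) = \boldsymbol{\mu} + \sqrt{\lambda_1}\cos\theta \, u_1 + \sqrt{\lambda_2}\sin\theta \, u_2$. A minimal NumPy sketch of this (the Gaussian parameters here are made up for illustration):

```python
import numpy as np

# Hypothetical 2D Gaussian parameters, chosen only for illustration
mu = np.array([0.0, 0.0])
Sigma = np.array([[2.0, 1.2],
                  [1.2, 1.0]])

# Eigendecomposition Sigma = U diag(lam) U^T (eigh: Sigma is symmetric)
lam, U = np.linalg.eigh(Sigma)
lam, U = lam[::-1], U[:, ::-1]      # sort descending so u_1 is the major axis

# Points on the ellipse of unit Mahalanobis distance:
# x(theta) = mu + sqrt(lam1)*cos(theta)*u_1 + sqrt(lam2)*sin(theta)*u_2
theta = np.linspace(0, 2 * np.pi, 200)
ellipse = (mu[:, None]
           + np.sqrt(lam[0]) * np.cos(theta) * U[:, [0]]
           + np.sqrt(lam[1]) * np.sin(theta) * U[:, [1]])   # shape (2, 200)

# Sanity check: every point on the curve is at Mahalanobis distance 1 from mu
diff = ellipse - mu[:, None]
d2 = np.einsum('dn,dk,kn->n', diff, np.linalg.inv(Sigma), diff)
assert np.allclose(d2, 1.0)
```

Scaling $\sqrt{\lambda_i}$ by a constant $c$ traces the contour at Mahalanobis distance $c$ instead, so plotting `ellipse` (e.g. with matplotlib) should reproduce the kind of figure described above.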
What I already have are the coordinates $(x_1, x_2)$ of each point, e.g. $\{x_1 = (-2.5, -1.5), x_2 = (-2.65, -1.75), \dots, x_n\}$.

Is it possible to compute all the parameters from these points: $\mu$, $\lambda_i$, $u_i$, and $y_i$?
Note: we see that the Mahalanobis distance corresponds to Euclidean distance in a transformed coordinate system, where we shift by $\mu$ and rotate by $U$.
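To make my question concrete, here is a sketch of what I assume the estimation would look like: take the sample mean for $\mu$, the sample covariance for $\Sigma$, eigendecompose it to get $\lambda_i$ and $u_i$, then form $y_i = u_i^T(\mathbf{x} - \boldsymbol{\mu})$ (the sample values below are made up):

```python
import numpy as np

# Hypothetical observed points, one row per point (x1, x2)
X = np.array([[-2.5, -1.5],
              [-2.65, -1.75],
              [0.5, 1.0],
              [1.2, 0.8],
              [2.0, 2.5]])

# Empirical mean and covariance
mu = X.mean(axis=0)
Sigma = np.cov(X, rowvar=False)

# Eigendecomposition: Sigma = U diag(lam) U^T
lam, U = np.linalg.eigh(Sigma)          # eigh since Sigma is symmetric
order = np.argsort(lam)[::-1]           # descending: u_1 = major axis
lam, U = lam[order], U[:, order]

# Transformed coordinates y_i = u_i^T (x - mu), one row per point
Y = (X - mu) @ U

# Mahalanobis distance two ways: sum_i y_i^2 / lambda_i vs. direct formula
d2_eig = (Y**2 / lam).sum(axis=1)
d2_dir = np.einsum('nd,dk,nk->n', X - mu, np.linalg.inv(Sigma), X - mu)
assert np.allclose(d2_eig, d2_dir)
```

The final assertion checks the identity from the quoted equation: summing $y_i^2/\lambda_i$ gives the same squared distance as $(\mathbf{x}-\boldsymbol{\mu})^T\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})$.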

