I'm taking Strang's linear algebra OCW with the textbook and met this proposition.
First, notation. ${}^{*}$ denotes the conjugate transpose (the usual operation for complex matrices), and ${}^{T}$ denotes the transpose without conjugation, so that $M^{T}_{i, j} = M_{j,i}$ and $M^{*}_{i,j} = \overline{M_{j, i}}$. Lower-triangular matrices, invertible or not, are denoted by $L$, upper-triangular ones by $U$, and diagonal ones by $D$. There is no invertibility restriction: $\begin{bmatrix}0 & 0 \\ 0 & 7i\end{bmatrix}$ is complex lower-triangular, so calling it $L$ is valid; it is also upper-triangular, so it could equally be a $U$; it is diagonal, so it could appear under the name $D$. In particular, the identity matrix is simultaneously a $U$, a $D$, and an $L$. $\begin{bmatrix} 0 & 7 \\ 0 & 0\end{bmatrix}$ is a $U$ but neither a $D$ nor an $L$. Notice that $L^{*}$ is actually a $U$, and similarly $U^T$ is actually an $L$. P(S)D refers to positive (semi-)definite, and similarly N(S)D.
You may prefer to view this definition as saying that $L$ is the set of all lower-triangular matrices, i.e. those whose entries above the main diagonal equal $0$; I abuse notation a bit so that an $L$ appearing in a formula is a concrete lower-triangular matrix rather than the set.
Here's the question: suppose a complex Hermitian $M$ has a complex factorization $M = LDU$, i.e. a product of a complex lower-triangular matrix, a complex diagonal matrix, and a complex upper-triangular matrix. Is it guaranteed that there exist some complex lower-triangular $L^\prime$ and complex diagonal $D^\prime$ such that $M = L^\prime D^\prime {L^\prime}^{*}$? What about the real analogue: if a real symmetric $M$ has a real factorization $M=LDU$, is the existence of some real lower-triangular $L^\prime$ and some real diagonal $D^\prime$ with $M=L^\prime D^\prime {L^\prime}^T$ guaranteed? Again, none of the matrices here carry invertibility or P(S)D restrictions; only their "shape" (upper-triangular, lower-triangular, or diagonal) matters.
For example, given
$\begin{bmatrix} 1 & \frac{4}{5} & \frac{-4}{5} \\ \frac{4}{5} & 1 & \frac{4}{5} \\ \frac{-4}{5} & \frac{4}{5} & \frac{1}{100} \end{bmatrix} = M = \begin{bmatrix}1 & 0 & 0 \\ \frac{4}{5} & 1 & 0\\ \frac{-4}{5} & 4 & 1 \end{bmatrix} I \begin{bmatrix}1 & \frac{4}{5} & \frac{-4}{5} \\ 0 & \frac{9}{25} & \frac{36}{25} \\ 0 & 0 & \frac{-639}{100} \end{bmatrix} = LDU$
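As a quick numerical sanity check (my own, not from the book), the stated $LDU$ factorization does reproduce this $M$:

```python
import numpy as np

# The matrices exactly as given above.
M = np.array([[1,    4/5, -4/5],
              [4/5,  1,    4/5],
              [-4/5, 4/5,  1/100]])
L = np.array([[1,    0, 0],
              [4/5,  1, 0],
              [-4/5, 4, 1]])
D = np.eye(3)
U = np.array([[1, 4/5,  -4/5],
              [0, 9/25,  36/25],
              [0, 0,    -639/100]])

print(np.allclose(L @ D @ U, M))  # True
```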
viewing it as a complex problem, one may say let $L^\prime = \begin{bmatrix}1 & 0 & 0 \\ \frac{4}{5} & \frac{3}{5} & 0 \\ \frac{-4}{5} & \frac{12}{5} & \frac{3\sqrt{71}i}{10} \end{bmatrix}$ and $D^\prime = \begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{bmatrix}$ such that
$L^\prime D^\prime {L^\prime}^{*} = \begin{bmatrix}1 & 0 & 0 \\ \frac{4}{5} & \frac{3}{5} & 0 \\ \frac{-4}{5} & \frac{12}{5} & \frac{3\sqrt{71}i}{10} \end{bmatrix} \begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{bmatrix} \begin{bmatrix}1 & \frac{4}{5} & \frac{-4}{5} \\ 0 & \frac{3}{5} & \frac{12}{5} \\ 0 & 0 & \frac{-3\sqrt{71}i}{10} \end{bmatrix} =\begin{bmatrix} 1 & \frac{4}{5} & \frac{-4}{5} \\ \frac{4}{5} & 1 & \frac{4}{5} \\ \frac{-4}{5} & \frac{4}{5} & \frac{1}{100} \end{bmatrix} = M$
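The complex rewrite can also be checked numerically (again my own sketch); note that the imaginary entry of $L^\prime$ meets its conjugate in ${L^\prime}^{*}$, which is what makes the $(3,3)$ entry come out right:

```python
import numpy as np

M  = np.array([[1,    4/5, -4/5],
               [4/5,  1,    4/5],
               [-4/5, 4/5,  1/100]], dtype=complex)
Lp = np.array([[1,    0,    0],
               [4/5,  3/5,  0],
               [-4/5, 12/5, 3*np.sqrt(71)*1j/10]], dtype=complex)
Dp = np.diag([1, 1, -1]).astype(complex)

# .conj().T is the conjugate transpose, i.e. the * operation.
print(np.allclose(Lp @ Dp @ Lp.conj().T, M))  # True
```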
Below are my attempts to figure this out. First, I noticed this toy example:
$LDU = \begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} I \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix} =\begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} = M$
which at first glance looks like a counterexample in the real case (since $U \neq L^T$), but I quickly realized it is not, since I could just as well rewrite it as
$L^\prime D^\prime {L^\prime}^{T} = \begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} \begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} \begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} =\begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} = M$
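A one-liner check (my own) that both factorizations of the toy example give the same $M$:

```python
import numpy as np

M = np.array([[0, 0], [0, 1]])
L = np.array([[0, 0], [0, 1]])
U = np.array([[1, 1], [0, 1]])
E = np.array([[0, 0], [0, 1]])   # serves as L', D', and L'^T all at once

print(np.array_equal(L @ np.eye(2, dtype=int) @ U, M))  # True
print(np.array_equal(E @ E @ E, M))                     # True
```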
since the proposition is an existence claim for $L^\prime$ and $D^\prime$; it may well be that $L^\prime \neq L$ and $D^\prime \neq D$, even though both pairs are lower-triangular/diagonal.
My linear algebra background is limited; in an attempt to find the answer myself, I checked the Cholesky decomposition, but it only concerns the equivalence between a Hermitian $M$ being PSD and $M = LL^{*}$, which is slightly off here, since the proposition allows an optional $D^{\prime}$ in between, and that can easily break the PSD requirement in the Cholesky argument. The first example shows this: $M=\begin{bmatrix} 1 & \frac{4}{5} & \frac{-4}{5} \\ \frac{4}{5} & 1 & \frac{4}{5} \\ \frac{-4}{5} & \frac{4}{5} & \frac{1}{100} \end{bmatrix}$ is not PSD (its determinant is negative, so it has a negative eigenvalue).
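A quick numerical check (my own) that this $M$ is indeed not PSD:

```python
import numpy as np

M = np.array([[1,    4/5, -4/5],
              [4/5,  1,    4/5],
              [-4/5, 4/5,  1/100]])

# M is symmetric, so eigvalsh applies; a negative eigenvalue rules out PSD.
print(np.linalg.eigvalsh(M).min() < 0)  # True
print(np.linalg.det(M) < 0)            # True
```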
The Wikipedia page on the Cholesky decomposition has a section dedicated to $LDL^{*}$, and it notes that matrices without a Cholesky decomposition (i.e. not PSD) can indeed sometimes be written as $LDL^{*}$, but it doesn't say whether this is always possible.
I also checked the common observation that $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ cannot be written as $LDU$, let alone as $L^{\prime}D^{\prime}{L^{\prime}}^{*}$, but that is also beside the point here, since one of the premises on $M$ is that it is an $LDU$ to begin with. (In some sense we should consider adding a permutation matrix $P$ s.t. $P\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} = LDU$ and take $M=P\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$.)
I'm only on section 2.7 of the aforementioned book; the proposition stems from this quote from Dr. Strang: "If $S=S^T$ is factored into $LDU$ with no row exchanges, then $U$ is exactly $L^T$". Given that his main focus in chapters 1 and 2 is mostly real invertible matrices, I assume he's talking about real invertible matrices, for which the claim is immediate since the $LDU$ factorization of an invertible matrix (with unit diagonals on $L$ and $U$) is unique when it exists (for the complex case use ${}^{*}$ instead and a similar argument follows), but I'm wondering whether the proposition still holds without the invertibility assumption.
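As a sanity check on the invertible symmetric case, here's a small numerical sketch of Strang's claim; `ldu_no_pivot` is a helper of my own (plain Gaussian elimination without row exchanges, then splitting the diagonal of the upper factor off as $D$), applied to an arbitrarily chosen symmetric invertible matrix:

```python
import numpy as np

def ldu_no_pivot(A):
    """Doolittle elimination with no row exchanges, then factor out the
    pivots so that L and U both have unit diagonal. Assumes all leading
    pivots are nonzero (no row exchanges needed)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.eye(n)
    for k in range(n - 1):
        L[k+1:, k] = A[k+1:, k] / A[k, k]          # multipliers below pivot k
        A[k+1:] -= np.outer(L[k+1:, k], A[k])      # eliminate column k
    D = np.diag(np.diag(A))
    U = np.diag(1 / np.diag(A)) @ A                # unit upper triangular
    return L, D, U

S = np.array([[2., 1, 0],
              [1,  2, 1],
              [0,  1, 2]])   # symmetric, invertible, pivots all nonzero
L, D, U = ldu_no_pivot(S)

print(np.allclose(L @ D @ U, S))  # True: valid LDU factorization
print(np.allclose(U, L.T))        # True: U is exactly L^T, as Strang says
```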