Generalized Power Function for Positive Definite Matrices


The generalized power function with arguments a square $p \times p$ matrix $Y$ and a real vector $s$ of length $p$ is defined as $$ |Y|_s = |Y_{[1]}|^{s_1-s_2} |Y_{[2]}|^{s_2-s_3} \ldots |Y_{[p]}|^{s_p}, $$ where $|Y_{[j]}|$ is the determinant of the square matrix composed of the first $j$ rows and columns of $Y$ (i.e., the $j$th leading principal minor of $Y$). Note that for $s_1 = s_2 = \ldots = s_p = s$, we get $|Y|_s = |Y|^s$.

Now for $X$ being a real symmetric positive definite matrix I have seen in a paper (and it checks out numerically) that we can also write $$ |X|_s = \prod_{i=1}^{p} D_{ii}^{s_i}, $$ where $D$ comes from the unique decomposition $X=LDL'$, where $L$ is a lower triangular matrix with ones on the main diagonal and $D$ is a diagonal matrix with positive diagonal elements.
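For what it's worth, the identity is indeed easy to check numerically. A minimal sketch in Python (variable names like `X`, `s`, `d` are my own; I obtain $D$ via the Cholesky factor $T = LD^{1/2}$, so $D_{ii} = T_{ii}^2$):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
A = rng.standard_normal((p, p))
X = A @ A.T + p * np.eye(p)   # random symmetric positive definite matrix
s = rng.standard_normal(p)

# Cholesky factor T with X = T T'; since T = L D^{1/2}, we have D_ii = T_ii^2
T = np.linalg.cholesky(X)
d = np.diag(T) ** 2

# Left-hand side: product of D_ii^{s_i}
lhs = np.prod(d ** s)

# Right-hand side: leading principal minors |X_[j]| raised to s_j - s_{j+1},
# with the convention s_{p+1} = 0
minors = np.array([np.linalg.det(X[:j, :j]) for j in range(1, p + 1)])
exponents = s - np.append(s[1:], 0.0)
rhs = np.prod(minors ** exponents)

assert np.isclose(lhs, rhs)
```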

Why is $$ \prod_{i=1}^{p} D_{ii}^{s_i} = |X_{[1]}|^{s_1-s_2} |X_{[2]}|^{s_2-s_3} \ldots |X_{[p]}|^{s_p} $$ for positive definite $X$? I don't see it intuitively, and I am having difficulty finding a proof.

Best Answer

If $$ X = LDL' = TT', $$ where $T = LD^{1/2}$ is the lower triangular matrix with positive diagonal elements (the Cholesky factor of $X$), then, because $T$ is lower triangular, the leading $j \times j$ block of $TT'$ depends only on the first $j$ rows and columns of $T$, so $$ X_{[j]} = T_{[j]}T_{[j]}' = L_{[j]} D_{[j]} L_{[j]}'. $$ Taking determinants and using $|L_{[j]}| = 1$, $$ |X_{[j]}| = |D_{[j]}| = \prod_{i=1}^{j} D_{ii}, $$ and thus $$ |X_{[j]}|/|X_{[j-1]}| = D_{jj}. $$
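The key step, $|X_{[j]}| = \prod_{i=1}^{j} D_{ii}$, can also be checked numerically. A quick sketch (variable names are mine; again $D_{ii} = T_{ii}^2$ from the Cholesky factor):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 5
A = rng.standard_normal((p, p))
X = A @ A.T + p * np.eye(p)   # random symmetric positive definite matrix

# Cholesky factor T = L D^{1/2}, so the D_ii of X = L D L' are T_ii^2
T = np.linalg.cholesky(X)
d = np.diag(T) ** 2

# Each leading principal minor |X_[j]| should equal D_11 * ... * D_jj
minors = np.array([np.linalg.det(X[:j, :j]) for j in range(1, p + 1)])
cumprods = np.cumprod(d)

assert np.allclose(minors, cumprods)
```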

Finally, with the convention $|X_{[0]}| := 1$, the product telescopes: $$ \prod_{i=1}^{p} D_{ii}^{s_i} = \prod_{i=1}^{p} \left(|X_{[i]}|/|X_{[i-1]}|\right)^{s_i} = |X_{[1]}|^{s_1-s_2} |X_{[2]}|^{s_2-s_3} \ldots |X_{[p]}|^{s_p}, $$ since each $|X_{[j]}|$ with $j < p$ picks up exponent $s_j$ from the $i = j$ factor and $-s_{j+1}$ from the $i = j+1$ factor.