Sum over all possible combinations of a Cholesky decomposition


Suppose we have an $n \times n$ positive definite matrix $\boldsymbol{\Sigma}$, and let $\boldsymbol{\Sigma} = \mathbf{B}\mathbf{B}^T$, where $\mathbf{B}$ is obtained from the Cholesky decomposition.

Let $\mathbf{S} = \operatorname{diag}(s_1,s_2,\dots , s_n)$ be a diagonal matrix where each $s_i, i=1,\dots, n$ can only take the value $1$ or $-1$, and let $\mathcal{S}$ be the set of all possible values of $\mathbf{S}$.

Consider the matrix $\boldsymbol{\Sigma}_s = \mathbf{S} \boldsymbol{\Sigma} \mathbf{S} $ and its Cholesky decomposition $\boldsymbol{\Sigma}_s= \mathbf{B}_s\mathbf{B}_s^T$.

Let $\mathbf{1}_n$ be the vector of $n$ ones. I am wondering if there is an easy way to compute

$\sum_{\mathbf{S} \in \mathcal{S}}\mathbf{B}_s \mathbf{1}_n$

EDIT 1

I computed the sum for an $m \times m$ matrix with $m=1,2,3$. If we let $\mathbf{D}$ be the vector of the diagonal elements of $\mathbf{B}$, then $\sum_{\mathbf{S} \in \mathcal{S}}\mathbf{B}_s \mathbf{1}_m = 2^m \mathbf{D}$.
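The observation above can be sanity-checked numerically by brute force: enumerate all $2^m$ sign matrices, take the Cholesky factor of each $\mathbf{S}\boldsymbol{\Sigma}\mathbf{S}$, and compare the accumulated sum with $2^m \mathbf{D}$. A quick sketch (the covariance matrix here is randomly generated for illustration, not from the question):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

for m in (1, 2, 3):
    # Build a random positive definite (covariance-like) matrix.
    A = rng.standard_normal((m, m))
    Sigma = A @ A.T + m * np.eye(m)
    B = np.linalg.cholesky(Sigma)  # lower triangular, Sigma = B B^T

    # Accumulate B_s @ 1_m over all 2^m sign matrices S.
    total = np.zeros(m)
    for signs in itertools.product([1.0, -1.0], repeat=m):
        S = np.diag(signs)
        B_s = np.linalg.cholesky(S @ Sigma @ S)
        total += B_s @ np.ones(m)

    # Conjecture: the sum equals 2^m times the diagonal of B.
    assert np.allclose(total, 2**m * np.diag(B))
```

The assertions pass for every random matrix I tried, which supports the conjecture empirically.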

EDIT 2

I forgot to say that $\boldsymbol{\Sigma}$ is a covariance matrix.


BEST ANSWER

I think I found the solution.

We can write the matrix:

$\boldsymbol{\Sigma}_s = \mathbf{B}_s \mathbf{B}_s^T = \mathbf{S}\boldsymbol{\Sigma} \mathbf{S} = \mathbf{S}\mathbf{B}\mathbf{B}^T\mathbf{S}$

Since $\mathbf{S}\mathbf{S} = \mathbf{I}$ and $\mathbf{S}^T = \mathbf{S}$, we can also write $\mathbf{S}\mathbf{B}\mathbf{B}^T\mathbf{S} = \mathbf{S}\mathbf{B}\mathbf{S}\,\mathbf{S}\mathbf{B}^T\mathbf{S} = (\mathbf{S}\mathbf{B}\mathbf{S})(\mathbf{S}\mathbf{B}\mathbf{S})^T$.

Then we have $\mathbf{B}_s = \mathbf{S}\mathbf{B}\mathbf{S}$. This is a valid Cholesky factor, since $\mathbf{S}\mathbf{B}\mathbf{S}$ is lower triangular and its diagonal entries, $s_i b_{ii} s_i = b_{ii}$, are all positive; by uniqueness of the Cholesky decomposition of a positive definite matrix, it is the only one.

Then $\sum_{\mathbf{S} \in \mathcal{S}} \mathbf{S}\mathbf{B}\mathbf{S} = 2^n \operatorname{diag}(\mathbf{B})$ (by $\operatorname{diag}(\cdot)$ I mean the diagonal matrix with diagonal entries given by the diagonal of the argument): each off-diagonal element of $\mathbf{S}\mathbf{B}\mathbf{S}$ equals $s_i b_{ij} s_j$, which over the $2^n$ sign matrices is $+b_{ij}$ half the time and $-b_{ij}$ the other half, so the off-diagonal terms cancel, while each diagonal element is always $b_{ii}$. Multiplying by $\mathbf{1}_n$ then gives $\sum_{\mathbf{S} \in \mathcal{S}} \mathbf{B}_s \mathbf{1}_n = 2^n \mathbf{D}$, matching EDIT 1.

Is this true? Can someone check whether this is really the answer?
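The key identity $\mathbf{B}_s = \mathbf{S}\mathbf{B}\mathbf{S}$ can be verified numerically for a single sign matrix (the covariance matrix and sign pattern below are arbitrary examples, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Random positive definite (covariance-like) matrix.
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)
B = np.linalg.cholesky(Sigma)  # Sigma = B B^T

# An arbitrary sign matrix S.
S = np.diag([1.0, -1.0, -1.0, 1.0])

# Cholesky factor of S Sigma S computed directly...
B_s = np.linalg.cholesky(S @ Sigma @ S)

# ...coincides with S B S, by uniqueness of the Cholesky decomposition.
assert np.allclose(B_s, S @ B @ S)
```

This works for any sign pattern, since $\mathbf{S}\mathbf{B}\mathbf{S}$ is always lower triangular with the same (positive) diagonal as $\mathbf{B}$.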

ANSWER

According to what you wrote, $\mathbf{B}_s = \mathbf{S}\mathbf{B}$, so $$ u = \sum_{\mathbf{S}\in \mathcal{S}} \mathbf{B}_s \mathbf{1}_n = \sum_{\mathbf{S}\in \mathcal{S}} \mathbf{S} (\mathbf{B}\mathbf{1}_n). $$ The quantity in parentheses does not change, so call it $v$; then $$ u = \sum_{\mathbf{S}\in \mathcal{S}} \mathbf{S}v. $$ Over all $\mathbf{S}$, each entry of $\mathbf{S}v$ equals $+v_i$ for half of the sign choices and $-v_i$ for the other half, so the sum should come out to zero.
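The cancellation step in this answer, $\sum_{\mathbf{S}\in\mathcal{S}} \mathbf{S}v = \mathbf{0}$ for any fixed vector $v$, is easy to check by enumeration (note this verifies only that step, not the premise $\mathbf{B}_s = \mathbf{S}\mathbf{B}$, which differs from the accepted answer's $\mathbf{B}_s = \mathbf{S}\mathbf{B}\mathbf{S}$):

```python
import itertools
import numpy as np

n = 3
v = np.array([1.7, -0.4, 2.2])  # any fixed vector

# Sum S v over all 2^n diagonal sign matrices S.
u = sum(np.diag(signs) @ v
        for signs in itertools.product([1.0, -1.0], repeat=n))

# Each entry appears with sign + half the time and - half the time.
assert np.allclose(u, 0.0)
```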