Compute the divergence of $n(x) = -\frac{\Sigma^{-1}x}{\|\Sigma^{-1}x\|}$


$\Sigma$ is an $n\times n$ positive definite matrix and I have the function $$ n(x) = -\Sigma^{-1}x $$ I define another function $m(x) = n(x) / \|n(x)\|$, where the denominator is the usual L2 norm. I would like to find the divergence of this new function, i.e. find a closed-form expression for $$ k(x) = \text{div} \,m(x) $$


2 Answers

Best answer

We can see that

$$k(x)=\sum_{i=1}^{n} \partial_{x_i}m_i(x)$$

$$\partial_{x_i}n_j(x)=-\partial_{x_i}\sum_{k=1}^{n}(\Sigma^{-1})_{jk} x_k=-(\Sigma^{-1})_{ji}$$

and

$$\partial_{x_i}\|n(x)\|=\partial_{x_i}\sqrt{\sum_{j=1}^{n} n_j(x)^2}=\frac{\partial_{x_i}\sum_{j=1}^{n} n_j(x)^2}{2\sqrt{\sum_{j=1}^{n} n_j(x)^2}}=\frac{2\sum_{j=1}^{n}n_j(x)\, \partial_{x_i}n_j(x)}{2\|n(x)\|}=-\frac{\sum_{j=1}^{n}n_j(x)\, (\Sigma^{-1})_{ji}}{\|n(x)\|}=\frac{n_i(n(x))}{\|n(x)\|}$$

where the last equality uses that $\Sigma^{-1}$ is symmetric (so $(\Sigma^{-1})_{ji}=(\Sigma^{-1})_{ij}$), since $\Sigma$ is positive definite.

Hence we can see that

$$\partial_{x_i}m_j(x)=\partial_{x_i}\frac{n_j(x)}{\|n(x)\|}=\frac{\|n(x)\|\,\partial_{x_i}n_j(x)-n_j(x)\,\partial_{x_i}\|n(x)\|}{\|n(x)\|^2}=\frac{-(\Sigma^{-1})_{ji}\|n(x)\|-n_j(x)\frac{n_i(n(x))}{\|n(x)\|}}{\|n(x)\|^2}=-\frac{(\Sigma^{-1})_{ji}}{\|n(x)\|}-\frac{n_j(x)\,n_i(n(x))}{\|n(x)\|^3}$$

and so finally

$$\text{div}\,m(x)=\sum_{i=1}^n\partial_{x_i}m_i(x)=-\frac{\text{Tr}(\Sigma^{-1})}{\|n(x)\|}-\frac{n(x)\cdot n(n(x))}{\|n(x)\|^3}=\frac{(\Sigma^{-1}x)\cdot (\Sigma^{-2}x)}{\|\Sigma^{-1}x\|^3}-\frac{\text{Tr}(\Sigma^{-1})}{\|\Sigma^{-1}x\|}.$$
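As a numerical sanity check (not part of the original derivation), the closed form above can be compared against a finite-difference approximation of the divergence; the matrix `Sigma` below is an arbitrary positive definite example:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary positive definite Sigma for illustration (n = 4)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + 4 * np.eye(4)
Sinv = np.linalg.inv(Sigma)

def m(x):
    """The normalized field m(x) = n(x)/||n(x)|| with n(x) = -Sigma^{-1} x."""
    nx = -Sinv @ x
    return nx / np.linalg.norm(nx)

def div_closed_form(x):
    """(Sigma^{-1}x).(Sigma^{-2}x)/||Sigma^{-1}x||^3 - Tr(Sigma^{-1})/||Sigma^{-1}x||"""
    s1 = Sinv @ x          # Sigma^{-1} x
    s2 = Sinv @ s1         # Sigma^{-2} x
    norm = np.linalg.norm(s1)
    return s1 @ s2 / norm**3 - np.trace(Sinv) / norm

def div_finite_diff(x, h=1e-6):
    """Central-difference approximation of sum_i d m_i / d x_i."""
    total = 0.0
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        total += (m(x + e)[i] - m(x - e)[i]) / (2 * h)
    return total

x = rng.standard_normal(4)
print(div_closed_form(x), div_finite_diff(x))  # should agree to several decimal places
```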

Another answer

$\def\S{\Sigma^{-1}} \def\l{\lambda} \def\L{\l^{-1}} \def\LL{\l^{-2}} \def\n{\nabla} \def\d{\cdot} \def\LR#1{\left(#1\right)} \def\op#1{\operatorname{#1}} \def\trace#1{\op{Tr}\LR{#1}} \def\frob#1{\left\| #1 \right\|} \def\qiq{\quad\implies\quad}$Consider an arbitrary vector $(n)$ and its length $\LR{\l=\frob n}$. Changes in these two quantities are of course related:
$$\l^2 = n^Tn \qiq \l\,d\l = n^Tdn$$
Construct the normalized vector $(m)$ and calculate how it changes:
$$\eqalign{ m &= \L n \\ dm &= \L\,dn - n\LL\,d\l \\ &= \L\,dn - n\LL \L n^Tdn \\ &= \L\LR{I-mm^T} dn }$$
Now consider how these quantities change with respect to $x$ instead of $n$:
$$n = -\S x \qiq dn = -\S\,dx$$
Substituting this into the previous result yields the gradient:
$$\eqalign{ dm &= \L\LR{mm^T-I}\S\,dx \\ \n m &= \L\LR{mm^T\S-\S} }$$
The divergence equals the trace of the gradient, therefore
$$\eqalign{ \n\d m &= \L\trace{mm^T\S-\S} \\ &= \frac{m^T\S m - \trace{\S}}{\frob{\S x}} }$$
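As a quick numerical sketch (an addition, with `Sigma` an arbitrary positive definite example), the trace form $\bigl(m^T\Sigma^{-1}m - \operatorname{Tr}(\Sigma^{-1})\bigr)/\|\Sigma^{-1}x\|$ can be checked against the other answer's expression $(\Sigma^{-1}x)\cdot(\Sigma^{-2}x)/\|\Sigma^{-1}x\|^3 - \operatorname{Tr}(\Sigma^{-1})/\|\Sigma^{-1}x\|$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_dim = 5

# Arbitrary positive definite Sigma for illustration
A = rng.standard_normal((n_dim, n_dim))
Sigma = A @ A.T + n_dim * np.eye(n_dim)
Sinv = np.linalg.inv(Sigma)

x = rng.standard_normal(n_dim)
s1 = Sinv @ x                    # Sigma^{-1} x
lam = np.linalg.norm(s1)         # lambda = ||Sigma^{-1} x||
m_vec = -s1 / lam                # normalized vector m

# Divergence via the trace form: (m^T Sigma^{-1} m - Tr(Sigma^{-1})) / lambda
div_trace_form = (m_vec @ Sinv @ m_vec - np.trace(Sinv)) / lam

# Divergence via the other answer's closed form
div_other = s1 @ (Sinv @ s1) / lam**3 - np.trace(Sinv) / lam

assert np.isclose(div_trace_form, div_other)
```

Both expressions are exact rearrangements of each other (using $m = -\Sigma^{-1}x/\|\Sigma^{-1}x\|$), so they agree to machine precision.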