How to prove that this block-matrix is positive-definite?


I have a $3n\times3n$ symmetric block matrix that I need to prove is positive definite: $$ M = \left(\begin{array}{ccc} M_{1,1}&\dots&M_{1,n}\\ \vdots&\ddots&\vdots\\ M_{n,1}&\dots&M_{n,n}\\ \end{array}\right). $$ Given a set of points $\{ \mathbf{x}_i \}_{i=1}^n$, define $\mathbf{x}_{ij} = \mathbf{x}_{i}-\mathbf{x}_{j}\in\mathbb{R}^3$ as the vector joining the $i$th and $j$th points. Then each block $$M_{ij} = \left(d_{ij}+\frac{\epsilon^2}{2}d_{ij}^3\right)I+\left(d_{ij}^3-\frac{3\epsilon^2}{2}d_{ij}^5\right)\mathbf{x}_{ij}\mathbf{x}_{ij}^T$$ is a $3\times3$ symmetric positive definite(*) matrix, where $I$ is the $3\times3$ identity matrix, $d_{ij} = (\mathbf{x}_{ij}^T\mathbf{x}_{ij}+\epsilon^2)^{-1/2}$ is a positive scalar, and $0<\epsilon\ll1$ is a small positive parameter.
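For concreteness, the construction above is easy to reproduce and sanity-check numerically. A minimal NumPy sketch (the question's own experiments used MAPLE/MATLAB; the random point cloud and $\epsilon = 10^{-2}$ here are arbitrary illustrative choices):

```python
import numpy as np

def build_M(X, eps):
    """Assemble the 3n x 3n matrix M from an (n, 3) array of points.

    Block (i, j) is (d + eps^2/2 d^3) I + (d^3 - 3 eps^2/2 d^5) x x^T,
    with x = x_i - x_j and d = (x.x + eps^2)^(-1/2).
    """
    n = X.shape[0]
    M = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(n):
            x = X[i] - X[j]
            d = (x @ x + eps ** 2) ** -0.5
            M[3*i:3*i+3, 3*j:3*j+3] = (
                (d + 0.5 * eps**2 * d**3) * np.eye(3)
                + (d**3 - 1.5 * eps**2 * d**5) * np.outer(x, x)
            )
    return M

# Arbitrary random point cloud; M should come out symmetric, and its
# smallest eigenvalue (numerically) positive.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
M = build_M(X, eps=1e-2)
print(np.allclose(M, M.T), np.linalg.eigvalsh(M).min())
```

Note that the diagonal blocks simplify: $\mathbf{x}_{ii} = 0$ gives $d_{ii} = 1/\epsilon$, hence $M_{ii} = \frac{3}{2\epsilon} I$.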

I have a lot of numerical evidence(**) to believe that this matrix is positive definite for any reasonable choice of $\epsilon$ and any number and distribution of points $\{ \mathbf{x}_i \}_{i=1}^n$; however, a general proof eludes me. Any ideas on how to prove that this matrix is positive definite?

If this is easy for you: what about the case where $\epsilon=\epsilon_j$ depends on $j$ but not on $i$, so that $M_{ij}\ne M_{ji}$ (although each block still satisfies $M_{ij}^T=M_{ij}$)?

If it turns out that this matrix is not positive definite in general, are there conditions on the distribution of points that guarantee that it will be positive definite?

Thank you in advance!

(*) We know each $M_{ij}$ is SPD because its eigenvalues, which are easily calculated with MAPLE, are all real and positive.

(**) The eigenvalues, computed numerically in MATLAB, approach zero from above as $\epsilon$, $1/n$ and $\|\mathbf{x}_{ij}\|$ all approach zero.
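One way to probe the trend in (**) is to densify a point cloud at fixed $\epsilon$ and track the smallest eigenvalue: it stays positive but shrinks as neighbouring points crowd together. A NumPy sketch (the point counts and $\epsilon = 0.05$ are arbitrary illustrative choices, not from the original experiments):

```python
import numpy as np

def build_M(X, eps):
    """3n x 3n block matrix with blocks
    (d + eps^2/2 d^3) I + (d^3 - 3 eps^2/2 d^5) x x^T."""
    n = X.shape[0]
    M = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(n):
            x = X[i] - X[j]
            d = (x @ x + eps ** 2) ** -0.5
            M[3*i:3*i+3, 3*j:3*j+3] = (
                (d + 0.5 * eps**2 * d**3) * np.eye(3)
                + (d**3 - 1.5 * eps**2 * d**5) * np.outer(x, x)
            )
    return M

# More and more points in the unit cube: the smallest eigenvalue
# decreases toward zero but (numerically) stays positive.
rng = np.random.default_rng(1)
min_eigs = []
for n in (5, 20, 80):
    X = rng.random((n, 3))          # uniform points in [0, 1]^3
    min_eigs.append(np.linalg.eigvalsh(build_M(X, eps=0.05)).min())
print(min_eigs)
```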


1 Answer


Some thoughts on the problem:

The matrix $A$ with blocks $A_{ij} = d_{ij} I$ can be written as $A = D \otimes I$, where $D$ has entries $d_{ij}$ and $\otimes$ is the Kronecker product. The matrix $B$ with blocks $B_{ij} = d_{ij}^3\mathbf x_{ij}\mathbf x_{ij}^T$ is given by $$ B = \sum_{i,j} d_{ij}^3 \,\mathbf e_i\mathbf e_j^T \otimes (\mathbf x_i - \mathbf x_j)(\mathbf x_i - \mathbf x_j)^T \\ = \sum_{i,j} d_{ij}^3 [\mathbf e_i \otimes (\mathbf x_i - \mathbf x_j)][\mathbf e_j \otimes (\mathbf x_i - \mathbf x_j)]^T \\ = U \operatorname{diag}(\operatorname{vec}(D))^3\, V^T, $$ where $$ U = (\mathbf e^T \otimes I_n) \odot Y, \qquad V = (I_n \otimes \mathbf e^T) \odot Y, \qquad Y = \mathbf e^T \otimes \mathbf X - \mathbf X \otimes \mathbf e^T, $$ with $\odot$ the column-wise Khatri–Rao product, so that column $(i,j)$ of $U$ is $\mathbf e_i \otimes (\mathbf x_i - \mathbf x_j)$ and column $(i,j)$ of $V$ is $\mathbf e_j \otimes (\mathbf x_i - \mathbf x_j)$. Here $\mathbf X$ is the $3\times n$ matrix with columns $\mathbf x_i$, $\mathbf e_i \in \Bbb R^n$ denotes the $i$th standard basis vector, $\mathbf e = (1,\dots,1)^T \in \Bbb R^n$, and vec denotes the column-major vectorization operator.
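This rank-structured form can be verified numerically. A NumPy sketch (the helper `khatri`, forming column-wise Kronecker products, is introduced here for illustration; note the cubed entries of $D$ in the diagonal factor and the pair index $(i,j)$ running in column-major, i.e. vec, order):

```python
import numpy as np

rng = np.random.default_rng(2)
n, eps = 5, 0.1
X = rng.standard_normal((3, n))      # columns are the points x_i
D = 1.0 / np.sqrt(((X[:, :, None] - X[:, None, :]) ** 2).sum(0) + eps**2)

# A = D kron I has blocks d_ij * I.
A = np.kron(D, np.eye(3))

# Direct assembly of B with blocks d_ij^3 x_ij x_ij^T.
B = np.zeros((3 * n, 3 * n))
for i in range(n):
    for j in range(n):
        x = X[:, i] - X[:, j]
        B[3*i:3*i+3, 3*j:3*j+3] = D[i, j] ** 3 * np.outer(x, x)

def khatri(P, Q):
    """Column-wise Khatri-Rao product: column k is P[:, k] kron Q[:, k]."""
    return (P[:, None, :] * Q[None, :, :]).reshape(P.shape[0] * Q.shape[0], -1)

e = np.ones((1, n))
Y = np.kron(e, X) - np.kron(X, e)        # column (i,j) = x_i - x_j
U = khatri(np.kron(e, np.eye(n)), Y)     # column (i,j) = e_i kron (x_i - x_j)
V = khatri(np.kron(np.eye(n), e), Y)     # column (i,j) = e_j kron (x_i - x_j)
B_fact = U @ np.diag(D.flatten(order="F") ** 3) @ V.T
print(np.allclose(B, B_fact))
```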

As $\epsilon \to 0^+$, the off-diagonal blocks of $M$ tend to those of $A + B$ (with $d_{ij} \to 1/\|\mathbf x_{ij}\|$); the diagonal blocks $M_{ii} = \frac{3}{2\epsilon} I$ grow without bound.
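A quick numerical check of this limiting behaviour for a single off-diagonal block (the pair vector $\mathbf x_{ij}$ below is an arbitrary choice): the $\epsilon^2$ correction terms vanish and the block converges to $\frac{1}{r} I + \frac{1}{r^3}\mathbf x_{ij}\mathbf x_{ij}^T$ at rate $O(\epsilon^2)$.

```python
import numpy as np

x = np.array([1.0, 0.2, -0.5])      # x_ij for one pair of distinct points
r = np.linalg.norm(x)
limit = (1 / r) * np.eye(3) + (1 / r**3) * np.outer(x, x)

# The gap || M_ij - (A + B)_ij || should shrink like eps^2.
errs = []
for eps in (1e-1, 1e-2, 1e-3):
    d = (x @ x + eps**2) ** -0.5
    M_ij = ((d + 0.5 * eps**2 * d**3) * np.eye(3)
            + (d**3 - 1.5 * eps**2 * d**5) * np.outer(x, x))
    errs.append(np.linalg.norm(M_ij - limit))
print(errs)
```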