Proving the conditions under which my structured matrix fails to be invertible


(**Edit:** Changed the range in Property 2 from $0 \leq R(\bf{x_i},\bf{x_j}) \leq 1$ to $0 < R(\bf{x_i},\bf{x_j}) \leq 1$.)

I have an $N\times N$ matrix $\bf R$ with the following structure:

$\begin{bmatrix}1 & R(\bf{x_1},\bf{x_2}) & R(\bf{x_1},\bf{x_3}) & \ldots & R(\bf{x_1},\bf{x_N})\\ R(\bf{x_2},\bf{x_1}) & 1 & R(\bf{x_2},\bf{x_3}) & \ldots & R(\bf{x_2},\bf{x_N})\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ R(\bf{x_N},\bf{x_1}) & R(\bf{x_N},\bf{x_2}) & R(\bf{x_N},\bf{x_3}) & \ldots & 1 \end{bmatrix}$

In this matrix, the function $R(\bf{x_i},\bf{x_j})$, called the correlation function, has the following properties:

Property 1: The domain of the correlation function consists of the pairs $(\bf{x_i},\bf{x_j})$ with $\bf{x_i},\bf{x_j} \in \mathbb{R}^2$

Property 2: The range of the correlation function is the interval $(0, 1]$, i.e., $0 < R(\bf{x_i},\bf{x_j}) \leq 1$

Property 3: The commutative property, i.e., $R(\bf{x_i},\bf{x_j}) = R(\bf{x_j},\bf{x_i})$

Property 4: $R(\bf{x_i},\bf{x_j}) = 1 \iff \bf{x_i} = \bf{x_j}$
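For concreteness, one function that satisfies all four properties is the squared-exponential kernel $R(\bf{x_i},\bf{x_j}) = e^{-\theta\|\bf{x_i}-\bf{x_j}\|^2}$ with $\theta > 0$ (my choice for illustration only; nothing below depends on it). A minimal Python sketch of the matrix construction:

```python
import numpy as np

def gaussian_correlation(xi, xj, theta=1.0):
    # Squared-exponential kernel: symmetric (Property 3), valued in (0, 1]
    # (Property 2), and equal to 1 exactly when xi == xj (Property 4).
    return np.exp(-theta * np.sum((xi - xj) ** 2))

def build_R(points, theta=1.0):
    # Assemble the N x N correlation matrix described above.
    N = len(points)
    R = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            R[i, j] = gaussian_correlation(points[i], points[j], theta)
    return R

R = build_R([np.array([0.0, 0.0]), np.array([1.0, 0.5]), np.array([2.0, 1.0])])
print(R)   # symmetric, unit diagonal, all off-diagonal entries in (0, 1)
```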

Based on these properties, I am trying to prove the following:

Proposition 1: For a correlation function satisfying Properties 1-4, the $\bf R$ matrix is singular if and only if $\bf{x_i} = \bf{x_j}$ for some $i, j \in \{1, \ldots, N\}$ with $i \neq j$.

I have so far found Proposition 1 to be true based on numerical experiments, but I want to complement these with a more formal mathematical proof. (That may also reveal other conditions under which the $\bf R$ matrix could be singular, of which I am currently unaware.)
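For instance, a quick check of this kind (a sketch reusing `build_R` from above; the test points are arbitrary):

```python
pts = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(np.linalg.det(build_R(pts)))        # nonzero: the points are distinct

pts_dup = pts + [np.array([0.0, 0.0])]    # repeat the first point
print(np.linalg.det(build_R(pts_dup)))    # ~0 (rows 1 and 4 coincide): singular
```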

Some preliminary observations:

  1. By Property 2, all entries of the matrix are positive.
  2. By Property 3, the matrix is symmetric.
  3. By Property 4, the main diagonal entries are all equal to 1.

My first approach was to try and show when the matrix fails to be full rank, based on an equation of the type:

$a_1\bf{R_{c1}} + a_2\bf{R_{c2}} + \ldots + a_N\bf{R_{cN}} = \bf{0}$

where $\bf R_{ci}$ refers to the $i$th column of the $\bf R$ matrix and $i \in \{1, \ldots, N\}$.

However, I failed to reduce a $3\times 3$ example ($N=3$) that I was testing to row echelon form, and Maple was also unable to solve this homogeneous linear system.
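For reference, the $N=3$ determinant is at least easy to obtain symbolically with SymPy (a sketch of my own; $a$, $b$, $c$ abbreviate $R(\bf{x_1},\bf{x_2})$, $R(\bf{x_1},\bf{x_3})$, $R(\bf{x_2},\bf{x_3})$):

```python
import sympy as sp

a, b, c = sp.symbols('a b c', positive=True)
R3 = sp.Matrix([[1, a, b],
                [a, 1, c],
                [b, c, 1]])

print(sp.expand(R3.det()))               # 1 - a**2 - b**2 - c**2 + 2*a*b*c
print(sp.factor(R3.det().subs(a, 1)))    # -(b - c)**2
```

In particular, if $\bf{x_1} = \bf{x_2}$, then $a = 1$ and also $b = c$ (both equal $R(\bf{x_1},\bf{x_3})$), so the determinant $-(b-c)^2$ is zero, in line with Proposition 1.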

That got me thinking: I have some more information that I maybe haven't really been using so far. My matrix is Hermitian (indeed, real and symmetric). I did some searching and found that Sylvester's criterion can be used to determine when a Hermitian matrix is positive definite, and thus invertible. By this criterion, the matrix is positive definite if and only if all of its leading principal minors are positive. The first two leading principal minors of the $\bf R$ matrix are:

  1. $1$
  2. $1-R(\bf{x_1},\bf{x_2})^2$

The first determinant is obviously positive. The second determinant is also positive on the entire domain, except when $R(\bf{x_1},\bf{x_2}) = 1$, which, by Property 4, implies that $\bf{x_1} = \bf{x_2}$.
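As a numerical sanity check (again a sketch reusing `build_R` from above), the leading principal minors can be inspected directly:

```python
def leading_minors(M):
    # Determinants of the k x k upper-left submatrices, k = 1, ..., N.
    return [np.linalg.det(M[:k, :k]) for k in range(1, M.shape[0] + 1)]

R = build_R([np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])])
print(leading_minors(R))   # all positive: positive definite by Sylvester's criterion
```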

While that seems to some extent show what I want, there are some obvious flaws with this "proof":

  1. Positive definiteness, as far as I understand, is a sufficient but not a necessary condition for matrix invertibility. Therefore, my reasoning is incomplete (see the small example after this list).

  2. I've only considered $(\bf{x_1},\bf{x_2})$, yet I'm trying to make a more general statement about $(\bf{x_i},\bf{x_j})$.

  3. Referring to point 2, the higher-order leading principal minors have quite complex determinant expressions. I'm not sure that I can show under which circumstances all of these determinants are positive.
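To illustrate point 1 with a standard example (this matrix does not satisfy Properties 1-4; it only shows that invertibility does not require positive definiteness):

```python
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.det(M))       # -1.0: invertible
print(np.linalg.eigvalsh(M))  # [-1.  1.]: indefinite, hence not positive definite
```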

I'm a relative novice at formal mathematical deduction. My formal training is in mechanical engineering, where the mathematics is typically more of the rote, applied kind. I am keenly aware of my limitations in this regard! I would therefore greatly appreciate the community's critique of my approach. Any assistance on how to proceed, or how to achieve this proof (which might, as far as I know and indeed hope, be very trivial) would be highly valued.

1 Answer

Your proposition doesn't appear to be true. If $\bf{x_1} = \bf{x_N}$ and $R(\bf{x_i},\bf{x_j}) = 0$ for every other pair with $i \neq j$, then you end up with

$\begin{bmatrix} 1 & 0 & 0 & \cdots & 0 & 1\\ 0 & 1 & 0 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & 0 & \cdots & 1 & 0\\ 1 & 0 & 0 & \cdots & 0 & 1 \end{bmatrix}$

After applying the row operation $R_N \leftarrow R_N - R_1$, the bottom row becomes all zeros; hence the matrix is singular.
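For completeness, a quick numerical confirmation of this counterexample (a sketch with $N = 5$):

```python
import numpy as np

N = 5
A = np.eye(N)
A[0, N - 1] = A[N - 1, 0] = 1.0    # x_1 = x_N, so R(x_1, x_N) = 1
print(np.linalg.matrix_rank(A))    # 4 = N - 1: rank deficient
print(np.linalg.det(A))            # 0.0: singular, as claimed
```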