Positive definite matrix over a non-Archimedean field


Suppose $K$ is a non-Archimedean ordered field, and $A$ is an $n\times n$ square matrix over $K$ such that (1) all diagonal entries of $A$ are $1$, and (2) all off-diagonal entries of $A$ are infinitesimal (i.e. between $-\frac1k$ and $\frac1k$ for all nonzero natural numbers $k$). For instance, if $K=\mathbb{R}(x)$ is the field of rational functions, ordered with $x$ infinitesimal, then the off-diagonal entries of $A$ could be polynomials in $x$ with zero constant term.
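For a concrete feel, here is a quick sympy check on a hypothetical $3\times 3$ matrix of this shape; the particular matrix is only an illustration:

```python
import sympy as sp

x = sp.symbols('x')
# A 3x3 matrix over R(x): ones on the diagonal, off-diagonal
# entries are polynomials in x with zero constant term
A = sp.Matrix([
    [1,    x,  x**2],
    [x,    1,  -x],
    [x**2, -x, 1],
])
# Leading principal minors, as polynomials in x
minors = [sp.expand(A[:k, :k].det()) for k in (1, 2, 3)]
print(minors)                          # [1, 1 - x**2, 1 - 2*x**2 - 3*x**4]
print([m.subs(x, 0) for m in minors])  # [1, 1, 1]
```

Each leading principal minor has constant term $1$, so each is positive once $x$ is a positive infinitesimal.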

Does it follow that $A$ is positive definite, i.e. that $V^{T} A V > 0$ (in the ordering of $K$) for all nonzero $V\in K^n$? This seems intuitive to me because it is an "infinitesimal deformation of the identity matrix", but I don't immediately see how to prove it.

Best answer:

The following statement is valid over $\mathbb{R}$: if $(a_{ij})$ is symmetric and $a_{ii} > \sum_{j\ne i} |a_{ij}|$ for all $i$, then $(a_{ij})$ is positive definite. Now, if we have an "algebraic" statement valid over $\mathbb{R}$, it is valid over any real closed field; here that statement is $\langle A x, x\rangle > 0$ for all $x \ne 0$. But your ordered field can be embedded in a real closed field, so the statement is true over any ordered field.

This is the philosophy... But probably the statement can be proved directly, without all this "meta" stuff.
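As a sanity check of the dominant-diagonal criterion over $\mathbb{Q} \subset \mathbb{R}$, a short sympy sketch; the example matrix is mine, not from the answer:

```python
import sympy as sp

# Strictly diagonally dominant symmetric matrix: a_ii > sum_{j != i} |a_ij|
A = sp.Matrix([
    [3,  1, -1],
    [1,  3,  1],
    [-1, 1,  3],
])
row_dominant = all(
    A[i, i] > sum(abs(A[i, j]) for j in range(3) if j != i)
    for i in range(3)
)
print(row_dominant, A.is_positive_definite)   # True True
```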

$\bf{Added:}$ The dominant-diagonal criterion is sharp, as one can see by looking at the eigenvalues of the all-ones matrix $(a_{ij})$ with $a_{ij}=1$ for all $i,j$, which are $n$ and $0$. But a weaker condition is enough, for instance $|a_{ij}|< \frac{1}{n-1}$ for all $i\ne j$. It is enough to add up all the inequalities $$\frac{1}{n-1}\left (x^2_{i} + x^2_{j}\right) + 2 a_{ij} x_i x_j\ge 0$$ for $i<j$, and note that the inequalities are strict for non-zero variables.

$\bf{Added 2:}$ In fact, diagonal dominance implies positive definiteness quite simply. Just add all the inequalities $$ |a_{ij}| x_{i}^2 + |a_{ij}| x_{j}^2+ 2 a_{ij} x_i x_j\ge 0$$ for all $i< j$ and get $$\sum_{i=1}^n s_i x^2_i + \sum_{i<j} 2 a_{ij} x_i x_j\ge 0$$ where $$s_i = \sum_{j\ne i} |a_{ij}|$$ Since $a_{ii} > s_i$ for all $i$, it follows that $x^T A x = \sum_i a_{ii} x_i^2 + \sum_{i<j} 2 a_{ij} x_i x_j > 0$ whenever $x \ne 0$.
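The bookkeeping in this summation can be verified symbolically for $n=3$; a sketch in which the symbols $a_{ij}$ are assumed nonnegative so that $|a_{ij}| = a_{ij}$:

```python
import sympy as sp

# Check that summing the elementary inequalities reproduces
# sum_i s_i x_i^2 + sum_{i<j} 2 a_ij x_i x_j  (case n = 3)
x1, x2, x3 = sp.symbols('x1 x2 x3')
a12, a13, a23 = sp.symbols('a12 a13 a23', nonnegative=True)

pairs = [(a12, x1, x2), (a13, x1, x3), (a23, x2, x3)]
lhs = sum(a*(xi**2 + xj**2) + 2*a*xi*xj for a, xi, xj in pairs)

s1, s2, s3 = a12 + a13, a12 + a23, a13 + a23   # s_i = sum_{j != i} |a_ij|
rhs = s1*x1**2 + s2*x2**2 + s3*x3**2 + 2*(a12*x1*x2 + a13*x1*x3 + a23*x2*x3)

print(sp.expand(lhs - rhs))   # 0
```

Each summand on the left is $a_{ij}(x_i + x_j)^2 \ge 0$, which is where the elementary inequality comes from.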

$\bf{Added 3:}$ Let's also give a purely algebraic proof that diagonally dominant matrices (by rows: $a_{ii} > \sum_{j\ne i} |a_{ij}|$ for all $i$) have determinant $>0$.

  1. The determinant cannot be $0$. Otherwise the system $A x = 0$ would have a non-zero solution $x$; if $|x_i|$ is its largest component, then $a_{ii}|x_i| = \left|\sum_{j\ne i} a_{ij} x_j\right| \le \sum_{j\ne i} |a_{ij}|\, |x_i| < a_{ii}|x_i|$, a contradiction.

  2. Deform the matrix into one with positive determinant, while preserving dominance. The usual proof uses the intermediate value property for polynomials; we will only use that property for polynomials of degree $1$, which is valid over every ordered field.

For this, consider for $t\in [0,1]$ the matrix $A_t$ that differs from $A$ only in its first row, which is $(a_{11}, t a_{12}, \ldots, t a_{1n})$. Since the determinant is linear in the first row, $$\det A_t = (1-t)\, a_{11} \det A' + t \det A$$ where $\det A'$ is the determinant of the matrix $(a_{ij})_{2 \le i,j\le n}$. So we can do an induction argument. The $n=1$ case is trivial. Assume the claim for $n-1$; then $\det A' > 0$, and therefore $\det A_0 > 0$. We know that $\det A_t \ne 0$ for $t \in [0,1]$, since each $A_t$ is still diagonally dominant. As $\det A_t$ is a polynomial of degree $1$ in $t$, we conclude $\det A_t > 0$ for all $t \in [0,1]$, and in particular $\det A_1 = \det A > 0$.
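The linearity of $\det A_t$ in $t$ can be confirmed with sympy for a generic $3\times 3$ matrix; an illustrative check only:

```python
import sympy as sp

# Verify det A_t = (1-t) a_11 det A' + t det A  for a generic 3x3 matrix
t = sp.symbols('t')
a11, a12, a13, a21, a22, a23, a31, a32, a33 = sp.symbols(
    'a11 a12 a13 a21 a22 a23 a31 a32 a33')

A  = sp.Matrix([[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]])
At = sp.Matrix([[a11, t*a12, t*a13], [a21, a22, a23], [a31, a32, a33]])
Aprime = A[1:, 1:]   # the block (a_ij)_{2 <= i,j <= n}

residual = sp.expand(At.det() - ((1 - t)*a11*Aprime.det() + t*A.det()))
print(residual)   # 0
```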

Answer:

Assuming the matrix is symmetric (since only the symmetric part survives in $v^TAv$), one can apply Sylvester's criterion, using e.g. the Laplace expansion and induction: the first leading minor, $1$, is positive. Supposing the $k$th leading minor is positive and can be written as $1+\epsilon$ where $\epsilon$ is infinitesimal, the Laplace expansion along the $(k+1)$st row (or column) of the $(k+1)$st minor gives an expression of the form $$ \det{A_{k+1}} = 1\cdot(1+\epsilon) + \epsilon' = 1+\epsilon'', $$ where the $\epsilon$s are all infinitesimal, since every element of that row apart from the final $1$ is infinitesimal.
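The opening remark, that only the symmetric part survives in $v^T A v$, can be checked symbolically; a minimal sketch:

```python
import sympy as sp

# Only the symmetric part S = (A + A^T)/2 survives in v^T A v
v1, v2 = sp.symbols('v1 v2')
b11, b12, b21, b22 = sp.symbols('b11 b12 b21 b22')

v = sp.Matrix([v1, v2])
A = sp.Matrix([[b11, b12], [b21, b22]])   # not assumed symmetric
S = (A + A.T) / 2

diff = sp.expand((v.T * A * v)[0] - (v.T * S * v)[0])
print(diff)   # 0
```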

Answer:

I think we can extend the criterion $v^T A v > 0$ to function fields this way:

Let $K = \mathbb{R}(x)$. Given $A \in K^{n \times n}$, let $D$ be the diagonal matrix whose entries are the roots of $\det(A-t I) \in K[t]$. Then each $D_{i,i}$ lies in the algebraic closure $\overline{K}$. Let $L = K(D_{1,1} ,\ldots, D_{n,n})$.

The ordering on $K$ is: $f \ge g$ if there exists $\epsilon > 0$ such that $f(c) \ge g(c)$ for all $c \in (0, \epsilon)$ (so that $x$ is a positive infinitesimal).

Thus $v^T A v > 0$ for all nonzero $v \in K^n$ implies that $A(c)$ is positive definite for $c \in (0, \epsilon)$, and hence $A(c) = P(c) D(c) P(c)^T$ with $D(c) > 0$, where the columns of $P(c)$ are eigenvectors of $A(c)$.

Assuming the $D_{i,i}$ are distinct, $(A-D_{i,i} I)P_i = 0$ is a linear system, so that $P \in L^{n \times n}$. If the $D_{i,i}$ are not distinct, then $P \in L^{n \times n}$ remains true.

Finally, $A(c) = P(c) D(c) P(c)^T$ on an interval implies (by analytic continuation) $A = P D P^T$ in $L$.

(Where $D(c) > 0$ only for $c \in (0,\epsilon)$; what about $D(c)$ for other $c$? Also, does $D > 0$ make sense, i.e. is $L$ ordered?)
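For a matrix whose eigenvalues happen to lie in $K$ itself, the decomposition $A = P D P^T$ can be exhibited concretely; my example, with $K = \mathbb{R}(x)$:

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, x], [x, 1]])
# The eigenvalues 1 - x and 1 + x already lie in K = R(x) here
D = sp.diag(1 - x, 1 + x)
# Orthonormal eigenvectors (entries involve sqrt(2), which is in R)
P = sp.Matrix([[1, 1], [-1, 1]]) / sp.sqrt(2)

print(sp.simplify(P*D*P.T - A))   # zero matrix
```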

Answer:

Here's how to use infinitesimals to make a simple argument. To fix a scale, note that it suffices to prove $V^T A V > 0$ just for those vectors $V$ whose largest component (by absolute value) is $1$.

Let $R$ be the subring of finite numbers and $M$ the (maximal) ideal of infinitesimal numbers. Note that all matrices and vectors involved have components in $R$.

Define $E = A - I$. Then,

$$ V^T A V = V^T V + (V^T E V) \equiv V^T V \pmod M $$

Since $V^T V \geq 1$ and $V^T E V$ is infinitesimal (the components of $V$ are bounded by $1$ while the entries of $E$ are infinitesimal), $V^T A V$ exceeds $1$ minus an infinitesimal, and is therefore positive.
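Modeling "infinitesimal" entries as multiples of $x$, the congruence $V^T A V \equiv V^T V \pmod M$ amounts to setting $x = 0$; a small sympy illustration with example values of my own:

```python
import sympy as sp

x = sp.symbols('x')
# E has infinitesimal entries; model "infinitesimal" as a multiple of x
E = sp.Matrix([[0, x], [x, 0]])
A = sp.eye(2) + E
v = sp.Matrix([1, sp.Rational(1, 2)])   # largest component is 1

q = sp.expand((v.T * A * v)[0])
print(q)             # x + 5/4, i.e. v^T v plus an infinitesimal
print(q.subs(x, 0))  # 5/4, the reduction mod the ideal M
```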