Positive definiteness of a symmetric matrix


Matrix $\mathbf{A}$ is an $n\times n$ square matrix with main diagonal $1,2,\dots,n$, secondary diagonals $1,2,\dots,n-1$, tertiary diagonals $1,2,\dots,n-2$, and so on. For example, if we take $n=5$ then $$\mathbf{A}=\begin{bmatrix} 1&1&1&1&1\\ 1&2&2&2&2\\ 1&2&3&3&3\\ 1&2&3&4&4\\ 1&2&3&4&5 \end{bmatrix}.$$ Clearly this real matrix is symmetric thus Hermitian, and therefore all its eigenvalues are real. I am trying to show that this type of matrix has all positive eigenvalues. Numerical experiments suggest that this is the case but I can't find a formal proof. Gershgorin circle theorem definitely won't help here and positive definiteness is basically an equivalent problem. So, how can I prove that every eigenvalue of $\mathbf{A}$ is positive?
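The numerical experiment mentioned above can be sketched as follows (a minimal check assuming NumPy; the pattern in the example gives the closed form $a_{ij}=\min(i,j)$, and `build_A` is just an illustrative helper name):

```python
import numpy as np

def build_A(n):
    """Build the n x n matrix with entries a_ij = min(i, j), 1-based."""
    idx = np.arange(1, n + 1)
    return np.minimum.outer(idx, idx)

# Symmetric matrix -> eigvalsh returns real eigenvalues; check they are positive.
for n in (2, 5, 10, 50):
    eigvals = np.linalg.eigvalsh(build_A(n).astype(float))
    print(n, eigvals.min() > 0)
```

For every tested size the smallest eigenvalue comes out positive, consistent with the claim.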


There are 3 answers below.

Accepted answer

Let me copy my answer to the MathOverflow question mentioned in the comment.


Let $B=(b_{ij})$ be the $n\times n$ upper triangular matrix with $$b_{ij}=\begin{cases} 1 & \text{if } i\le j,\\ 0 & \text{otherwise.} \end{cases}$$ Then $A=B^{T}B$: indeed, $(B^{T}B)_{ij}=\sum_{k}b_{ki}b_{kj}=\#\{k : k\le i \text{ and } k\le j\}=\min(i,j)=a_{ij}$.


It follows immediately that $A$ is positive definite: $B$ is invertible (upper triangular with ones on the diagonal), so $x^{T}Ax=\|Bx\|^{2}>0$ for every $x\neq 0$.
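The factorization is easy to check numerically (a sketch assuming NumPy; `build_B` is an illustrative helper name, not from the answer):

```python
import numpy as np

def build_B(n):
    """Upper triangular B with b_ij = 1 when i <= j (1-based), else 0."""
    return np.triu(np.ones((n, n), dtype=int))

# Verify A = B^T B, where A has entries min(i, j), for several sizes.
for n in (3, 5, 8):
    B = build_B(n)
    A = B.T @ B
    idx = np.arange(1, n + 1)
    print(n, np.array_equal(A, np.minimum.outer(idx, idx)))
```

Each size prints `True`, confirming the product reproduces the min-matrix.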

Another answer

Apply Sylvester's criterion: a Hermitian matrix $M$ is positive definite if and only if each of its leading principal minors is positive. The $k$-th leading principal minor is the determinant of the upper-left $k\times k$ submatrix of $M$.

In this case each leading principal minor is easy to evaluate by row reduction: the submatrix reduces to an upper triangular matrix whose diagonal entries are all $1$. For instance, if we were calculating the $5$-th leading principal minor of some $A$ with $n\geqslant 5$ we would have

$$\begin{bmatrix} 1&1&1&1&1\\ 1&2&2&2&2\\ 1&2&3&3&3\\ 1&2&3&4&4\\ 1&2&3&4&5 \end{bmatrix} \implies \begin{bmatrix} 1&1&1&1&1\\ 0&1&1&1&1\\ 0&1&2&2&2\\ 0&1&2&3&3\\ 0&1&2&3&4 \end{bmatrix} $$

by subtracting the first row from the rows below it. Proceeding,

$$\begin{bmatrix} 1&1&1&1&1\\ 0&1&1&1&1\\ 0&1&2&2&2\\ 0&1&2&3&3\\ 0&1&2&3&4 \end{bmatrix} \implies \begin{bmatrix} 1&1&1&1&1\\ 0&1&1&1&1\\ 0&0&1&1&1\\ 0&0&1&2&2\\ 0&0&1&2&3 \end{bmatrix} $$

by subtracting the second row from the rows below it, and so on. After $k-1$ such steps the $k\times k$ submatrix is upper triangular with every diagonal entry equal to $1$, so each leading principal minor equals $1>0$ and Sylvester's criterion applies.
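The conclusion that every leading principal minor equals $1$ can be spot-checked numerically (a sketch assuming NumPy; determinants are rounded since `numpy.linalg.det` works in floating point):

```python
import numpy as np

n = 8
idx = np.arange(1, n + 1)
A = np.minimum.outer(idx, idx).astype(float)  # a_ij = min(i, j)

# Each leading principal minor should reduce to exactly 1.
minors = [round(np.linalg.det(A[:k, :k])) for k in range(1, n + 1)]
print(minors)
```

All eight minors come out as $1$, in line with the row-reduction argument.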

Another answer

It is wise to exploit the symmetry of this matrix.

Recall that adding a multiple of one row (or column) to another preserves the determinant.

Consider subtracting the first column from each other column.

We then get $$\begin{bmatrix} 1&0&0&0&0\\ 1&1&1&1&1\\ 1&1&2&2&2\\ 1&1&2&3&3\\ 1&1&2&3&4 \end{bmatrix}.$$

Then, subtracting the first row from each of the rows below it gives the block matrix $\begin{bmatrix}1&0\\0&A_{n-1}\end{bmatrix}$, where $A_{n-1}$ is the same matrix with one less dimension. Hence $\det A_n=\det A_{n-1}$, and induction gives $\det A_n=1$ for every $n$. Since every leading principal submatrix of $A$ has this same form, all leading principal minors equal $1>0$, and positive definiteness follows from Sylvester's criterion.
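The reduction step above can be sketched as follows (assuming NumPy; `reduce_once` is an illustrative helper name for one round of the column-then-row subtraction):

```python
import numpy as np

def reduce_once(A):
    """One reduction step: subtract column 1 from the other columns,
    then row 1 from the rows below it; return the lower-right block."""
    M = A.astype(int).copy()
    M[:, 1:] -= M[:, :1]   # subtract first column from every other column
    M[1:, :] -= M[:1, :]   # subtract first row from the rows below it
    return M[1:, 1:]

n = 6
idx = np.arange(1, n + 1)
A = np.minimum.outer(idx, idx)          # a_ij = min(i, j)
B = reduce_once(A)
small = np.minimum.outer(idx[:-1], idx[:-1])
print(np.array_equal(B, small))         # block equals A with one less dimension
```

The printed `True` confirms that one reduction step shrinks $A_n$ to $A_{n-1}$, which is exactly what the induction uses.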