When is this matrix positive semi-definite?


I have a symmetric $n \times n$ matrix as follows. I want to find the eigenvalues of this Hessian matrix in order to show that it is not positive semi-definite (i.e., some eigenvalues are negative while the others are non-negative). If I cannot find all the eigenvalues, it would be enough for my purpose to exhibit just one negative eigenvalue:

NOTE: I was able to find the eigenvalues in the $2\times2$ and $3\times3$ cases; I am looking for the general $n\times n$ case. In the $2\times 2$ case, $a_1$ and $a_2$ must be equal for the eigenvalues to be non-negative; similarly, in the $3\times 3$ case, $a_1$, $a_2$ and $a_3$ must all be equal. $$A=\begin{pmatrix} 2a_1&a_1+a_2&\dots &a_1+a_{n-1}&a_1+a_n\\ a_1+a_2&2a_2&\dots &a_2+a_{n-1}&a_2+a_n\\ \vdots&\vdots&&\vdots&\vdots\\ a_1+a_{n-1}&a_2+a_{n-1}&\dots&2a_{n-1}&a_{n-1}+a_n\\ a_1+a_n&a_2+a_n&\dots&a_n+a_{n-1}&2a_n \end{pmatrix}$$

So $A_{ij} = a_i+a_j$.
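For concreteness, here is a small numerical sketch (my addition, using NumPy; `build_A` is just an illustrative helper name) that builds $A$ from a vector $a$ and inspects its spectrum. For a generic choice of $a$, the eigenvalues indeed mix signs:

```python
import numpy as np

def build_A(a):
    """Symmetric matrix with entries A[i, j] = a[i] + a[j] (so A[i, i] = 2 a[i])."""
    a = np.asarray(a, dtype=float)
    return a[:, None] + a[None, :]

a = np.array([1.0, -2.0, 3.0, 0.5])
A = build_A(a)
eigs = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
# For this generic a, the spectrum mixes signs: one negative eigenvalue,
# two (numerical) zeros, and one positive eigenvalue.
print(eigs)
```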


2 Answers

BEST ANSWER

The matrix $A$ can be decomposed as $A=au^T+ua^T$, where $a\in\mathbb{R}^n$ is the vector with entries $a_1,\ldots,a_n$ and $u\in\mathbb{R}^n$ is the vector of all ones. If $v\in\mathbb{R}^n$ is a vector perpendicular to both $a$ and $u$, then $Av=0$. Hence zero is always an eigenvalue of multiplicity at least $n-2$. For the last two eigenvalues, we have two cases.

In the first case, $a$ and $u$ are linearly dependent, i.e., $a=\alpha u$ for some $\alpha\in\mathbb{R}$: then $A=2\alpha uu^T$ and the last two eigenvalues are $0$ and $2\alpha n$. The associated eigenvectors are $\frac{u}{||u||}$ (for the eigenvalue $2\alpha n$) and any $n-1$ linearly independent vectors orthogonal to $u$ (for the eigenvalue $0$). If $\alpha\geq 0$, all the eigenvalues are non-negative and the matrix is PSD; if $\alpha<0$, it is negative semi-definite.
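A quick numerical sanity check of this dependent case (a NumPy sketch I'm adding, not part of the original answer): with $a=\alpha u$, the spectrum should consist of $n-1$ zeros and the single eigenvalue $2\alpha n$.

```python
import numpy as np

n, alpha = 5, 3.0
u = np.ones(n)
a = alpha * u                        # linearly dependent case: a = alpha * u
A = np.outer(a, u) + np.outer(u, a)  # equals 2 * alpha * u u^T
eigs = np.linalg.eigvalsh(A)
# Expect n - 1 zero eigenvalues and one eigenvalue equal to 2 * alpha * n = 30.
print(eigs)
```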

In the second case, $a$ and $u$ are linearly independent. For our analysis, we can limit our attention, without loss of generality, to vectors $v\in\mathbb{R}^n$ in the span of $\{a,u\}$ (if this is not true, the component of $v$ that is orthogonal to both $a$ and $u$ cancels out, as discussed above). This means that there exist $\alpha,\beta\in\mathbb{R}$ such that $v=\alpha a+\beta u$. The eigenvalue equation $Av=\lambda v$ reduces to, on the LHS,

$$ Av=(au^T+ua^T)(\alpha a+\beta u)=\left((u^Ta) \alpha+(u^Tu)\beta\right)a +\left((a^Ta)\alpha+(a^Tu)\beta\right) u, $$

and on the RHS

$$ \lambda v= \lambda\alpha a +\lambda\beta u. $$

Since $a$ and $u$ are linearly independent, the equation has a solution iff the coefficients of $a$ and $u$ on the two sides match.

We therefore get a reduced $2 \times 2$ system of equations of the form

$$ \begin{bmatrix} u^Ta & ||u||^2\\||a||^2 & u^Ta \end{bmatrix}\begin{bmatrix}\alpha\\\beta\end{bmatrix}=\lambda \begin{bmatrix}\alpha\\\beta\end{bmatrix}. $$

Note that $||u||^2=n$. Intuitively, this makes sense, because we reduced our $n\times n$ eigenvalue problem to the $2 \times 2$ eigenvalue problem obtained by restricting the vectors to the 2-dimensional space spanned by $a$ and $u$.

Symbolic computation software informs me that the eigenvalues for the reduced $2 \times 2$ case are

$\lambda_{1,2}=u^Ta\pm\sqrt{n||a||^2}.$

(Note that this formula gives the right result also for the case $a=\alpha u$, although, technically, its derivation would not be correct.)
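As a sanity check, the closed form above can be compared against a direct numerical diagonalization (a NumPy sketch of my own, not from the original answer): the predicted spectrum is $u^Ta\pm\sqrt{n||a||^2}$ together with $n-2$ zeros.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
a = rng.normal(size=n)               # generic a, linearly independent of u
u = np.ones(n)
A = np.outer(a, u) + np.outer(u, a)  # A = a u^T + u a^T

# Closed form: u^T a +/- sqrt(n * ||a||^2), plus n - 2 zero eigenvalues.
center = u @ a
spread = np.sqrt(n) * np.linalg.norm(a)
predicted = np.sort(np.concatenate(([center - spread, center + spread],
                                    np.zeros(n - 2))))
computed = np.linalg.eigvalsh(A)     # ascending order
print(np.allclose(predicted, computed))
```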

To conclude, the matrix $A$ can be PSD or indefinite depending on $a$. By Cauchy–Schwarz, $|u^Ta| < \sqrt{n}\,||a||$ whenever $a$ and $u$ are linearly independent, so in that case $\lambda_2 < 0 < \lambda_1$ and $A$ is always indefinite; $A$ is PSD exactly when $a=\alpha u$ with $\alpha\geq 0$. E.g., if $a=u$, then it is PSD, but if $a$ is orthogonal to $u$, then it is indefinite.


Note that $A$ can be represented as $$A=a^T\otimes e+e^T\otimes a,$$ where $a=(a_1,\ldots,a_n)$ and $e=(1,\ldots,1)$. Hence $A$ has rank $\leq 2$ and can have at most two non-zero eigenvalues.

The corresponding column eigenvectors are linear combinations of $a^T$ and $e^T$. Writing $w=\alpha a+\beta e$, we get $$Aw^T=\bigl(\alpha (e,a)+\beta(e,e)\bigr) a^T+\bigl(\alpha (a,a)+\beta(a,e)\bigr) e^T,$$ where $(x,y)=x\cdot y^T=\sum_{k=1}^n x_k y_k$ denotes the usual scalar product.

Therefore, the equation for the two nontrivial eigenvalues becomes $$\operatorname{det}\left(\begin{array}{cc} (e,a)-\lambda & (e,e) \\ (a,a) & (a,e)-\lambda \end{array}\right)=0,$$ or, equivalently, $$\lambda^2-2(e,a)\lambda+(e,a)^2-(e,e)(a,a)=0.$$

Yet in another form: $$\lambda^2-2n\langle a\rangle\lambda+n^2\left(\langle a\rangle^2-\langle a^2\rangle\right)=0\quad \Longrightarrow \quad \lambda_{\pm}=n\left[\langle a\rangle\pm\sqrt{\langle a^2\rangle}\right],$$ where $\langle x\rangle=\displaystyle\frac{\sum_{k=1}^n x_k}{n}$.

Since $\langle a\rangle^2\leq\langle a^2\rangle$ by Cauchy–Schwarz, with equality iff all the $a_k$ are equal, $\lambda_-\geq 0$ forces $a_1=a_2=\ldots=a_n$ with common value $\geq 0$. Therefore $A$ is positive semi-definite iff $a_1=a_2=\ldots=a_n\geq 0$.
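The criterion $\lambda_{\pm}=n\bigl[\langle a\rangle\pm\sqrt{\langle a^2\rangle}\bigr]$ can also be checked numerically (a NumPy sketch I'm adding; `nontrivial_eigs` is an illustrative helper name, not from the answer):

```python
import numpy as np

def nontrivial_eigs(a):
    """The two (possibly) nonzero eigenvalues n * (<a> +/- sqrt(<a^2>))."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    mean, mean_sq = a.mean(), np.mean(a ** 2)
    return n * (mean - np.sqrt(mean_sq)), n * (mean + np.sqrt(mean_sq))

# Generic (non-constant) a: lambda_- < 0, so A is indefinite.
lo, hi = nontrivial_eigs([1.0, 2.0, 3.0])
# Constant non-negative a: lambda_- = 0, so A is PSD.
lo0, hi0 = nontrivial_eigs([2.0, 2.0, 2.0])
```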