I am studying the conditions for positive semi-definiteness of an $(n+1)\times(n+1)$ symmetric matrix $\mathbf{M}$ built in the following way:
$$
\mathbf{M}=\begin{pmatrix}
\mathbf{A} & \mathbf{b} \\
\mathbf{b}^T & c
\end{pmatrix}
$$
where $\mathbf{A}$ is a symmetric $n\times n$ matrix, $\mathbf{b}$ is an $n$-dimensional column vector and $c$ is a real number.
The first $n$ leading principal minors of $\mathbf{M}$ are the leading principal minors of $\mathbf{A}$; more to the point, $\mathbf{A}$ is a principal submatrix of $\mathbf{M}$, so $\mathbf{A}$ must be positive semi-definite.
The last condition is $\det\mathbf{M}=|\mathbf{M}|\geq0$. By a simple calculation, I obtained
$$
|\mathbf{M}|=c|\mathbf{A}|-\mathbf{b}^T\mathbf{A}^*\mathbf{b}\geq0
$$
where $\mathbf{A}^*$ is the adjugate (classical adjoint) of $\mathbf{A}$, i.e. the transpose of the matrix of cofactors.
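As a quick numerical sanity check of this determinant identity, here is a minimal NumPy sketch (the random symmetric $\mathbf{A}$, the vector $\mathbf{b}$, the scalar $c$ and the seed are illustrative choices only, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
# Random symmetric A, random b and c (illustrative test data).
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
b = rng.standard_normal(n)
c = rng.standard_normal()

# Adjugate of A: for invertible A, adj(A) = det(A) * inv(A).
adjA = np.linalg.det(A) * np.linalg.inv(A)

# Build the bordered matrix M = [[A, b], [b^T, c]].
M = np.block([[A, b[:, None]], [b[None, :], np.array([[c]])]])

lhs = np.linalg.det(M)
rhs = c * np.linalg.det(A) - b @ adjA @ b
assert np.isclose(lhs, rhs)
```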
This condition can be written
$$
c|\mathbf{A}|-\mathbf{b}^T\mathbf{A}^*\mathbf{b}=
\begin{cases}
|\mathbf{A}|\left(c-\mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}\right), & \text{if }|\mathbf{A}|>0 \\
-\mathbf{b}^T\mathbf{A}^*\mathbf{b}, & \text{if }|\mathbf{A}|=0
\end{cases}
$$
So, when $|\mathbf{A}|>0$ the condition simply becomes
$$
c\geq\mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}\geq0,
$$
given that $\mathbf{A}^{-1}$ is positive definite.
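This case can also be checked numerically: with $\mathbf{A}$ positive definite, $\mathbf{M}$ should be positive semi-definite exactly when $c\geq\mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}$. A hedged NumPy sketch (the matrix construction and the tolerances are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)

n = 4
# Positive definite A (illustrative construction), arbitrary b.
B = rng.standard_normal((n, n))
A = B @ B.T + np.eye(n)
b = rng.standard_normal(n)

# The threshold value of c from the Schur-complement condition.
s = b @ np.linalg.inv(A) @ b

def M_of(c):
    """Bordered matrix [[A, b], [b^T, c]] for a given corner entry c."""
    return np.block([[A, b[:, None]], [b[None, :], np.array([[c]])]])

# At the threshold c = s, M is PSD (smallest eigenvalue is numerically zero).
assert np.linalg.eigvalsh(M_of(s)).min() > -1e-9
# Below the threshold, M has a negative eigenvalue.
assert np.linalg.eigvalsh(M_of(s - 0.1)).min() < -1e-9
```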
When $|\mathbf{A}|=0$ the condition becomes
$$
\mathbf{b}^T\mathbf{A}^*\mathbf{b}\leq0,
$$
so I am interested to know if $\mathbf{A}^*$ is positive semi-definite when $\mathbf{A}$ is positive semi-definite.
In the case $|\mathbf{A}|>0$, we can write the spectral decomposition
$$
\mathbf{A}=\sum_{i=1}^n\lambda_i\mathbf{e}_i\otimes\mathbf{e}_i,
$$
where $\lambda_i$ are the eigenvalues and $\mathbf{e}_i$ the corresponding unit eigenvectors. This gives
$$
\mathbf{A}^*=|\mathbf{A}|\mathbf{A}^{-1}=\left(\prod_{k=1}^n{\lambda}_k\right)\sum_{i=1}^n\frac{1}{\lambda_i}\mathbf{e}_i\otimes\mathbf{e}_i = \sum_{i=1}^n\left(\prod_{k=1,k\neq i}^n{\lambda}_k\right)\mathbf{e}_i\otimes\mathbf{e}_i,
$$
so $\mathbf{A}^*$ is positive definite when $\mathbf{A}$ is, since each of its eigenvalues is the product of the eigenvalues of $\mathbf{A}$ with one omitted in turn.
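This eigenvalue relation between $\mathbf{A}$ and $\mathbf{A}^*$ is easy to verify numerically; a small NumPy sketch (the positive definite test matrix is an arbitrary illustrative construction):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 4
# A positive definite test matrix (illustrative construction).
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

lam, E = np.linalg.eigh(A)          # eigenvalues and orthonormal eigenvectors
adjA = np.linalg.det(A) * np.linalg.inv(A)

# Eigenvalues of adj(A): product of the eigenvalues of A with one omitted.
mu = np.array([np.prod(np.delete(lam, i)) for i in range(n)])

# Rebuild adj(A) from the spectral formula and compare.
adjA_spec = sum(mu[i] * np.outer(E[:, i], E[:, i]) for i in range(n))
assert np.allclose(adjA, adjA_spec)
```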
I suspect that this last expression also represents $\mathbf{A}^*$ when $|\mathbf{A}|=0$, perhaps by regarding a positive semi-definite matrix with vanishing determinant as the limit of positive definite matrices as one or more eigenvalues tend to zero.
So my questions:
- are my calculations correct?
- is the last expression for $\mathbf{A}^*$ also valid when $|\mathbf{A}|=0$?
- how can it be proved?
Yes, your equations are correct. Yes, the last expression you wrote is valid when $|A| = 0$. Note in particular that $\mathbf A^* = 0$ whenever the kernel of $\mathbf A$ has dimension at least $2$.
For a quick proof, we could simply note that both sides of the equation $$ \mathbf{A}^* = \sum_{i=1}^n\left(\prod_{k=1,k\neq i}^n{\lambda}_k\right)\mathbf{e}_i\otimes\mathbf{e}_i $$ are continuous functions of the entries of $\mathbf A$. If the equation holds for all strictly positive definite $\mathbf A$, then it must hold for positive semidefinite $\mathbf A$ "by continuity". In particular, if we define $\mathbf A_{\epsilon} = \mathbf A + \epsilon \mathbf I$, which has the same unit eigenvectors $\mathbf e_i$ as $\mathbf A$ and eigenvalues $\lambda_k^{\epsilon} = \lambda_k + \epsilon > 0$, then for a positive semidefinite $\mathbf A$ we have $$ \mathbf{A}^* = \lim_{\epsilon \to 0^+}\mathbf{A}_{\epsilon}^* = \lim_{\epsilon \to 0^+}\sum_{i=1}^n\left(\prod_{k=1,k\neq i}^n{\lambda}_k^{\epsilon}\right)\mathbf{e}_i\otimes\mathbf{e}_i = \sum_{i=1}^n\left(\prod_{k=1,k\neq i}^n{\lambda}_k\right)\mathbf{e}_i\otimes\mathbf{e}_i. $$
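The singular case can be probed numerically as well: the spectral formula, evaluated with one eigenvalue equal to zero, should agree with the adjugate computed directly from cofactors. A sketch (the rank-deficient test matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 4
# Positive semi-definite A with a one-dimensional kernel (rank n-1).
B = rng.standard_normal((n, n - 1))
A = B @ B.T

lam, E = np.linalg.eigh(A)
lam[np.isclose(lam, 0)] = 0.0       # clean up the numerically-zero eigenvalue

# Spectral formula, now with one eigenvalue equal to zero.
mu = np.array([np.prod(np.delete(lam, i)) for i in range(n)])
adjA_spec = sum(mu[i] * np.outer(E[:, i], E[:, i]) for i in range(n))

# Independent computation of the adjugate from cofactors:
# adj(A)[j, i] = (-1)^(i+j) * det(A with row i and column j removed).
adjA = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        adjA[j, i] = (-1) ** (i + j) * np.linalg.det(minor)

assert np.allclose(adjA, adjA_spec)
# As predicted, adj(A) is positive semidefinite (and rank 1 here).
assert np.all(np.linalg.eigvalsh(adjA) >= -1e-9)
```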
For a direct proof: we have already noted that $\dim\ker \mathbf A \geq 2$ implies $\mathbf A^* = 0$, which is trivially positive semidefinite. For the case $\dim\ker \mathbf A = 1$: since $\mathbf A \mathbf A^* = |\mathbf A|\,\mathbf I = 0$, every column of $\mathbf A^*$ lies in the one-dimensional kernel of $\mathbf A$, so $\mathbf A^*$ has rank at most $1$. Because $\mathbf A^*$ is symmetric (as $\mathbf A$ is), it can be written in the form $\mathbf A^* = k \mathbf {xx}^T$ for some unit vector $\mathbf x$ and some $k \in \Bbb R$, with $k = \operatorname{tr}(\mathbf A^*)$.
With that, it suffices to note that $$ \operatorname{tr}(\mathbf A^*) = -\frac{d}{dt}\bigg|_{t = 0} \det(\mathbf A - t\mathbf I) = -\frac{d}{dt}\bigg|_{t = 0} (\lambda_1 - t) \cdots (\lambda_n - t) = \sum_{i=1}^n \prod_{k\neq i}\lambda_k. $$ Since $\dim\ker\mathbf A = 1$, exactly one eigenvalue vanishes, say $\lambda_1 = 0$; only the $i = 1$ term survives, so $k = \operatorname{tr}(\mathbf A^*) = \lambda_2\cdots\lambda_n > 0$, and hence $\mathbf A^* = k\,\mathbf{xx}^T$ is positive semidefinite.
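The identity $\operatorname{tr}(\mathbf A^*) = -\frac{d}{dt}\big|_{t=0}\det(\mathbf A - t\mathbf I)$ can be sanity-checked numerically, since $\operatorname{tr}(\mathbf A^*)$ is the sum of the principal $(n-1)\times(n-1)$ minors. A sketch using a central finite difference (the test matrix, step size and tolerances are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                   # symmetric test matrix

# Trace of the adjugate = sum of the principal (n-1)x(n-1) minors of A.
tr_adj = sum(
    np.linalg.det(np.delete(np.delete(A, i, axis=0), i, axis=1))
    for i in range(n)
)

# -d/dt at t = 0 of det(A - tI), by a central finite difference.
h = 1e-6
deriv = (np.linalg.det(A + h * np.eye(n))
         - np.linalg.det(A - h * np.eye(n))) / (2 * h)

assert np.isclose(tr_adj, deriv, rtol=1e-4, atol=1e-4)
```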