Let $A\in\mathbb{R}^{n\times n}$ be a square matrix and set $B=J_n-I_n-A$, where $I_n, J_n\in\mathbb{R}^{n\times n}$ are the identity matrix and the all-ones matrix, respectively. Consider
$$C=\begin{pmatrix}A & I_n\\ I_n & B\end{pmatrix}$$
What can we say about the determinant and the spectrum of $C$ if we know those of $A$ (and perhaps of $B$)? Can we make the results more precise if we assume that $A$ is a symmetric matrix consisting only of $0$s and $1$s, with $0$s on the diagonal (i.e. $A$ is the adjacency matrix of a graph)?
I came up with a graph operation and I'm playing with it to see what happens. Unfortunately I'm not so good with the algebraic side of it and not really used to block matrices, so I don't know the relevant theorems.
PS: If you have seen this or a similar construction before please leave a reference in the comments.
A partial result:

You can check that one can "reduce-by-blocks" the matrix $C$ in the following way:
$$\begin{pmatrix}I_n & 0\\ I_n & -A\end{pmatrix}\begin{pmatrix}A & I_n\\ I_n & B\end{pmatrix}\begin{pmatrix}I_n & I_n\\ 0 & -A\end{pmatrix}=\begin{pmatrix}A & 0\\ 0 & A(BA-I_n)\end{pmatrix}\tag{1}$$
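Identity (1) can be sanity-checked numerically. A sketch in Python/NumPy, using a random symmetric $0/1$ matrix with zero diagonal (i.e. a random adjacency matrix) as an arbitrary test case:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random adjacency matrix: symmetric 0/1 entries, zero diagonal.
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.triu(A, 1)
A = A + A.T

I = np.eye(n)
J = np.ones((n, n))
Z = np.zeros((n, n))
B = J - I - A                       # B = J_n - I_n - A
C = np.block([[A, I], [I, B]])     # the block matrix in question

# Left and right factors of the block reduction (1).
L = np.block([[I, Z], [I, -A]])
R = np.block([[I, I], [Z, -A]])

lhs = L @ C @ R
rhs = np.block([[A, Z], [Z, A @ (B @ A - I)]])
print(np.allclose(lhs, rhs))        # True: (1) checks out
```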
Taking determinants of both sides of (1), and using the fact that the determinant of a block-triangular matrix is the product of the determinants of its diagonal blocks, gives:
$$\det(-A)\det(C)\det(-A)=\det(A)\det(A)\det(BA-I_n)$$
Therefore, if $\det(A) \neq 0$, we can cancel $\det(-A)\det(-A)=(-1)^{2n}\det(A)^2=\det(A)^2$ against $\det(A)\det(A)$ and obtain
$$\det(C)=\det(BA-I_n)$$
(Since both sides are polynomials in the entries of $A$ and the invertible matrices are dense, the identity in fact holds for all $A$.)
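The resulting formula $\det(C)=\det(BA-I_n)$ can also be checked numerically. A sketch in NumPy, using a generic random $A$ (which is invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

A = rng.standard_normal((n, n))     # generic A, so det(A) != 0 almost surely
I = np.eye(n)
J = np.ones((n, n))
B = J - I - A
C = np.block([[A, I], [I, B]])

det_C = np.linalg.det(C)
det_reduced = np.linalg.det(B @ A - I)
print(np.isclose(det_C, det_reduced))   # True
```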
Remark: if $A$ is symmetric, then so is $B=J_n-I_n-A$, and hence so is $C$. In particular, the eigenvalues of $C$ are all real.
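For instance, taking $A$ to be the adjacency matrix of the 4-cycle (an arbitrary small example), one can confirm that $C$ is symmetric and its spectrum is real:

```python
import numpy as np

# Adjacency matrix of the 4-cycle, as an arbitrary example.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
n = A.shape[0]
I = np.eye(n)
J = np.ones((n, n))
B = J - I - A                      # also symmetric, since J, I, A are
C = np.block([[A, I], [I, B]])

print(np.allclose(C, C.T))                       # True: C is symmetric
eigs = np.linalg.eigvals(C)
print(np.max(np.abs(eigs.imag)) < 1e-9)          # True: real spectrum
```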