Assume that $X$ and $A$ are $n\times n$ real matrices, with $A$ invertible (in fact, the identity matrix up to a non-zero factor).
Can we derive conditions on $X$ such that both $X+A$ and $X-A$ are invertible?
Conversely, knowing that both $X+A$ and $X-A$ are invertible, does this provide some information on $X$?
I tried to decompose the inverses using known formulas (Henderson–Searle, Woodbury), but my attempts seem to lead only to circular arguments.
Write $\,A= \lambda\cdot\mathbb{1}\,$ for the fixed matrix, with $\mathbb 1$ denoting the $n\times n$ identity matrix and $\lambda$ a scalar factor. The condition "$X+A$ and $X-A$ are invertible"
is then equivalent to neither $\,-\lambda\,$ nor $\,\lambda\,$ being an eigenvalue of $X$.
(This equivalence holds regardless of whether $\lambda = 0$ or not.)
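A quick numerical sanity check of this equivalence (an illustration, not part of the argument): with $A = \lambda\,\mathbb 1$, the matrix $X+A$ is singular exactly when $-\lambda \in \sigma(X)$, and $X-A$ is singular exactly when $\lambda \in \sigma(X)$. The concrete matrix below is just an example.

```python
import numpy as np

lam = 2.0
# Example X with eigenvalues 1, -2, 3; note that -lam = -2 IS an
# eigenvalue of X, while lam = 2 is NOT.
X = np.diag([1.0, -2.0, 3.0])
A = lam * np.eye(3)

# -lam is in the spectrum of X, so X + A is singular (det ~ 0)
print(np.linalg.det(X + A))
# lam is not in the spectrum of X, so X - A is invertible (det != 0)
print(np.linalg.det(X - A))
```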
Another equivalent formulation is, by definition:
$\{\lambda, -\lambda\}$ is contained in the resolvent set of $X$, i.e., the complement within $\mathbb C$ of the spectrum $\sigma(X)$.
This is all that can be said, because every admissible spectrum is actually realized by some real matrix:

- For a prescribed set of real eigenvalues, you may take the diagonal matrix whose diagonal entries are the numbers from the given set.
- If the prescribed set also contains conjugate pairs, you obtain a corresponding real block-diagonal matrix by placing all the real elements on the diagonal and, for each conjugate pair $\,a\pm ib\,$, a $2\times 2$ block $\left(\begin{smallmatrix}a&b\\ -b&a\end{smallmatrix}\right)$; note that $\,\begin{pmatrix} a&b\\ -b&a\end{pmatrix} \begin{pmatrix} 1\\ \pm i\end{pmatrix} = (a\pm ib) \begin{pmatrix} 1\\ \pm i\end{pmatrix}\,$.
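The block-diagonal construction above can be sketched numerically; the particular eigenvalue set below ($1$, $-3$, and the pair $2\pm 5i$) is just a made-up example, and `block_diag` is a small helper defined here, not a library call.

```python
import numpy as np

def block_diag(*blocks):
    """Assemble square blocks into one block-diagonal matrix."""
    n = sum(b.shape[0] for b in blocks)
    M = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        M[i:i+k, i:i+k] = b
        i += k
    return M

reals = [1.0, -3.0]          # prescribed real eigenvalues
pairs = [(2.0, 5.0)]         # each (a, b) stands for the pair a +- ib

# Real entries go on the diagonal as 1x1 blocks; each conjugate
# pair a +- ib contributes the 2x2 block [[a, b], [-b, a]].
blocks = [np.array([[r]]) for r in reals]
blocks += [np.array([[a, b], [-b, a]]) for a, b in pairs]

M = block_diag(*blocks)
# The eigenvalues of this real matrix are exactly 1, -3, 2+5i, 2-5i.
print(np.linalg.eigvals(M))
```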