I consider the following equation, where $M\in\mathbb R^{N\times N}$ is the unknown matrix, assumed to be symmetric: $$\sum_{J\subseteq [1,N]}\left(M-I_J\right)^{-1}=0$$ where $[1,N]$ is the set of integers between $1$ and $N$, the sum runs over all $2^N$ subsets $J$, and $I_J$ is the $N\times N$ diagonal matrix with ones in the diagonal positions indexed by $J$ and zeros elsewhere.
A simple solution of this equation is $M=\frac{1}{2}I$ ($I$ is the identity matrix). My conjecture is that it is the only solution. The conjecture is true when $N=2$. In addition, one can see that any solution $M$ must have all its diagonal entries equal to $1/2$ (pre-multiply the equation by $I-M$, post-multiply it by $M$, and compare the diagonal entries).
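As a quick numerical sanity check (not part of any proof), one can verify by brute force that $M=\frac{1}{2}I$ satisfies the equation, enumerating all $2^N$ subsets $J$ for a small $N$; the choice $N=3$ below is arbitrary:

```python
import itertools
import numpy as np

N = 3
M = 0.5 * np.eye(N)  # candidate solution M = (1/2) I

# Sum (M - I_J)^{-1} over all 2^N subsets J of {0, ..., N-1}
total = np.zeros((N, N))
for k in range(N + 1):
    for J in itertools.combinations(range(N), k):
        I_J = np.zeros((N, N))
        I_J[list(J), list(J)] = 1.0  # diagonal indicator matrix of J
        total += np.linalg.inv(M - I_J)

print(np.allclose(total, 0))  # True
```

The check works because $M - I_J$ is then diagonal with entries $\pm\frac12$, so each inverse is diagonal with entries $\pm 2$, and each index belongs to exactly half of the subsets, so the contributions cancel.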
A probabilistic way of looking at this equation (which is what actually led me to that equation) is to consider a sequence of $N$ i.i.d. Bernoulli random variables $X_1,\ldots,X_N$ with parameter $1/2$ and define the diagonal matrix $S_X=\textsf{Diag}(X_1,\ldots,X_N)$. Then, the equation rewrites as $$\mathbb E\left[(M-S_X)^{-1}\right]=0.$$
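The probabilistic form can likewise be checked by Monte Carlo for $M=\frac{1}{2}I$: drawing i.i.d. Bernoulli($1/2$) vectors and averaging the inverses should give a matrix close to zero (sample size and seed below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_samples = 3, 100_000
M = 0.5 * np.eye(N)  # candidate solution M = (1/2) I

# Monte Carlo estimate of E[(M - S_X)^{-1}] with X_i ~ Bernoulli(1/2)
acc = np.zeros((N, N))
for _ in range(n_samples):
    X = rng.integers(0, 2, size=N)        # i.i.d. Bernoulli(1/2) draws
    S_X = np.diag(X.astype(float))        # S_X = Diag(X_1, ..., X_N)
    acc += np.linalg.inv(M - S_X)

print(np.abs(acc / n_samples).max())  # small: estimate of a zero matrix
```

Each sampled inverse is diagonal with entries $\pm 2$, so the empirical mean fluctuates around zero at the usual $O(1/\sqrt{n})$ rate.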
I am trying to prove (or disprove) my conjecture. Thank you in advance for your insights!