Suppose I have the matrix \begin{align} F_s&=\begin{bmatrix} F_{1s} & G \\ G^T & F_{2s} \end{bmatrix} \end{align} where $F_{1s}$ and $F_{2s}$ are symmetric negative definite matrices. Then $F_s$ is negative definite if and only if the following matrix inequality (a Schur complement condition) holds: \begin{align} F_{2s}<G^TF^{-1}_{1s}G. \end{align}
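As I understand it, this equivalence comes from the standard block factorization (a congruence, so it preserves definiteness): \begin{align} \begin{bmatrix} F_{1s} & G \\ G^T & F_{2s} \end{bmatrix} = \begin{bmatrix} I & 0 \\ G^TF^{-1}_{1s} & I \end{bmatrix} \begin{bmatrix} F_{1s} & 0 \\ 0 & F_{2s}-G^TF^{-1}_{1s}G \end{bmatrix} \begin{bmatrix} I & F^{-1}_{1s}G \\ 0 & I \end{bmatrix}, \end{align} so $F_s$ is negative definite exactly when $F_{1s}$ and the Schur complement $F_{2s}-G^TF^{-1}_{1s}G$ are both negative definite; since $F_{1s}$ is negative definite by assumption, this reduces to $F_{2s}<G^TF^{-1}_{1s}G$.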
My Question
Suppose that I replace $F_s$ with \begin{align} F&=\begin{bmatrix} F_{1s} & G_1 \\ G_2 & F_{2s} \end{bmatrix} \end{align} Is $F$ negative definite if and only if the following matrix inequality holds: \begin{align} F_{2s}<G_2F^{-1}_{1s}G_1? \end{align}
Background
This question occurred to me while reading Section 2.2 of the paper On Partial Contraction Analysis for Coupled Nonlinear Oscillators (it is not necessary to read that section; I am just giving some background). I am trying to solve a much larger problem that would become significantly easier if $F_{2s}<G_2F^{-1}_{1s}G_1$ implied that $F$ is negative definite.
Attempt
Because general properties of matrices are outside my field of expertise, my first idea was to look for a counterexample. My best candidate is \begin{align} \begin{bmatrix} -1 & a \\ b & -1 \end{bmatrix} \end{align} where $a$ and $b$ are scalars. The inequality $F_{2s}<G_2F^{-1}_{1s}G_1$ becomes \begin{align} -1&<b(-1)a\\ ab&<1. \end{align} As the eigenvalues of the matrix are $\lambda_{1,2}=-1\pm\sqrt{ab}$, they are all negative (or have negative real part when $ab<0$) if and only if $ab<1$, so the statement holds in this example, at least if negative definiteness is judged by the eigenvalues.
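For what it is worth, here is a quick numerical sanity check of this $2\times2$ family (just a sketch using NumPy, judging negative definiteness by the eigenvalues as above):

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(10_000):
    # Random off-diagonal scalars a, b for the 2x2 example above.
    a, b = rng.uniform(-3.0, 3.0, size=2)
    M = np.array([[-1.0, a],
                  [b, -1.0]])

    # The proposed inequality F_2s < G_2 F_1s^{-1} G_1 reduces to a*b < 1 here.
    inequality_holds = a * b < 1.0

    # Negative definiteness judged by the eigenvalues -1 +/- sqrt(ab)
    # having negative real part (my working assumption for this check).
    eigenvalues_negative = bool(np.all(np.linalg.eigvals(M).real < 0.0))

    assert inequality_holds == eigenvalues_negative, (a, b)

print("No mismatch found over 10,000 random (a, b) pairs.")
```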
Obviously this is not a proof, but I cannot see how to prove the general case. Any hint or help would be appreciated.