Suppose $A,B \in M_{n}(\Bbb{R})$ are of the form $A = \left[C_{1}\middle|\frac{I}{0\dots0}\right], B= \left[C_{2}\middle|\frac{I}{0\dots0}\right]$, i.e., each consists of a first column ($C_{1}$ resp. $C_{2}$, with $C_{1} \neq C_{2}$) followed by $I_{n-1}$ stacked on a zero row.
Thus we have $A = B+ \xi e_{1}^T$, where $\xi = C_{1} - C_{2}$ is an $n \times 1$ column vector and $e_{1}^{T} = [1, 0,\ldots,0]$.
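To fix ideas, here is a quick numerical sanity check of the setup (numpy sketch; the helper name `companion_like` and the random choices are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

def companion_like(c):
    """Matrix whose first column is c and whose remaining n x (n-1)
    block is I_{n-1} stacked on a zero bottom row, as in the question."""
    M = np.zeros((n, n))
    M[:, 0] = c
    M[:n-1, 1:] = np.eye(n - 1)
    return M

C1, C2 = rng.standard_normal(n), rng.standard_normal(n)
A, B = companion_like(C1), companion_like(C2)

xi = C1 - C2                      # A and B differ only in the first column
e1 = np.zeros(n); e1[0] = 1.0
assert np.allclose(A, B + np.outer(xi, e1))   # A = B + xi e1^T
```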
$\textbf{Assumptions}:$
Let $\lambda_{i}, i=1, \ldots, n$ denote the eigenvalues of $AB^2$, and assume $|\lambda_{i}|<1$ for all $i$.
Let $\beta_{i}, i=1, \ldots, n$ denote the eigenvalues of $A^2B$, and assume $|\beta_{i}|<1$ for all $i$.
$\textbf{Claim:}$
$\textbf{a})$ Suppose $\det(AB + A + I) = \det(BA+B+I) + e_{1}^{T} \operatorname{adj}(A^2 +A+I)\, \xi$. Can we prove or disprove that $\det(AB + A + I)<0$ and $\det(BA+B+I)<0$ cannot both hold?
$\textbf{Thoughts:}$
Another way of stating it: suppose $\det(AB+A+I)<0$; then we need $\det(BA+B+I) \geq 0$. By the identity in a), this is equivalent to $e_{1}^{T} \operatorname{adj}(A^2 +A+I)\, \xi \leq \det(AB+A+I)$. So it boils down to proving $e_{1}^{T} \operatorname{adj}(A^2 +A+I)\, \xi \leq \det(AB+A+I)<0$.
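Since the argument leans on the determinant identity from a), it is worth spot-checking it numerically. A sketch (numpy; the `companion_like` helper and the adjugate-via-inverse shortcut are mine, and the shortcut assumes the matrix is invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

def companion_like(c):
    # first column c, then I_{n-1} stacked on a zero row
    M = np.zeros((n, n))
    M[:, 0] = c
    M[:n-1, 1:] = np.eye(n - 1)
    return M

def adj(M):
    # adjugate via adj(M) = det(M) * inv(M); valid for invertible M
    return np.linalg.det(M) * np.linalg.inv(M)

A = companion_like(rng.standard_normal(n))
B = companion_like(rng.standard_normal(n))
xi = (A - B)[:, 0]
e1 = np.zeros(n); e1[0] = 1.0
I = np.eye(n)

lhs = np.linalg.det(A @ B + A + I)
rhs = np.linalg.det(B @ A + B + I) + e1 @ adj(A @ A + A + I) @ xi
assert np.isclose(lhs, rhs)       # det(AB+A+I) = det(BA+B+I) + e1^T adj(A^2+A+I) xi
```

In every random trial the two sides agree, which is consistent with deriving the identity from the matrix determinant lemma $\det(M+\xi v^T)=\det(M)+v^T \operatorname{adj}(M)\xi$ applied to $M = A^2+A+I$, using that $A$ commutes with $\operatorname{adj}(A^2+A+I)$.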
Another thought is to use a perturbation argument: fix $A, B$ and define $B(\epsilon):= A + \epsilon (B-A)$. For $\epsilon = 0$ both determinants equal $\det(A^2+A+I)$, which is nonnegative: the eigenvalues of $A^2+A+I$ are $\lambda^2+\lambda+1$ over the eigenvalues $\lambda$ of $A$; real $\lambda$ give $\lambda^2+\lambda+1 \geq 3/4 > 0$, and complex-conjugate pairs contribute $|\lambda^2+\lambda+1|^2 \geq 0$. Hence the statement holds at $\epsilon=0$. For $\epsilon=1$, we have $B(1) = B$. If the statement fails at $\epsilon=1$, then by continuity of the determinants in $\epsilon$ there should be a minimal $\epsilon$ for which the statement is false. Can we get a contradiction for $\epsilon < 1$?