Matrices $A$ and $B$ are symmetric positive semidefinite. I wish to lower-bound the minimal eigenvalue of the matrix product:
$$B(A+B)^{-1}A$$
The first lower bound I came up with is:
$$\lambda_{\min} \left( B (A+B)^{-1} A \right) \geq \frac{\lambda_{\min}(A) \cdot \lambda_{\min}(B)}{\lambda_{\max}(A+B)}\geq\frac{\lambda_{\min}(A) \cdot \lambda_{\min}(B)}{\lambda_{\max}(A)+\lambda_{\max}(B)}$$
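As a quick numerical sanity check of this bound (random symmetric positive definite matrices; `random_spd` is just a helper for this illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Helper: random symmetric positive definite matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

n = 5
A = random_spd(n)
B = random_spd(n)

# B(A+B)^{-1}A has real eigenvalues (it is similar to a symmetric
# PSD matrix), so taking the real part is safe numerically.
prod = B @ np.linalg.inv(A + B) @ A
lam_min = np.min(np.linalg.eigvals(prod).real)

eA = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
eB = np.linalg.eigvalsh(B)
bound = eA[0] * eB[0] / (eA[-1] + eB[-1])
assert lam_min >= bound - 1e-9
```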
According to this lower bound, if we perform rank-one updates on $A$ and $B$ in such a way that $\lambda_{\min}(A)$ and $\lambda_{\min}(B)$ remain the same but $\lambda_{\max}(A)$ and $\lambda_{\max}(B)$ increase (by updating $A$ and $B$ along the eigenvectors corresponding to their maximum eigenvalues), then the lower bound decreases.
However, my simulation results show that the minimum eigenvalue of this matrix product actually always increases as I apply such rank-one updates to $A$ and $B$.
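A simulation along these lines can be sketched as follows (one random trial; the update direction is the top eigenvector of each matrix, so $\lambda_{\min}$ of $A$ and $B$ is unchanged while $\lambda_{\max}$ grows):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_spd(n):
    """Helper: random symmetric positive definite matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

def prod_min_eig(A, B):
    """lambda_min of B(A+B)^{-1}A via the symmetric form (A^{-1}+B^{-1})^{-1}."""
    return np.linalg.eigvalsh(np.linalg.inv(np.linalg.inv(A) + np.linalg.inv(B)))[0]

n = 5
A = random_spd(n)
B = random_spd(n)
before = prod_min_eig(A, B)

# Rank-one updates along the top eigenvectors: lambda_max increases,
# lambda_min stays the same (only the top eigenvalue is shifted).
wA, VA = np.linalg.eigh(A)
wB, VB = np.linalg.eigh(B)
A2 = A + np.outer(VA[:, -1], VA[:, -1])
B2 = B + np.outer(VB[:, -1], VB[:, -1])
after = prod_min_eig(A2, B2)

assert after >= before - 1e-9  # the product's lambda_min does not decrease
```

This behavior is in fact guaranteed: $A_2 \succeq A$ implies $A_2^{-1} \preceq A^{-1}$, so $(A_2^{-1}+B_2^{-1})^{-1} \succeq (A^{-1}+B^{-1})^{-1}$, and the minimum eigenvalue can only grow.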
My question is: is there a tighter lower bound on the minimum eigenvalue of this product?
Assume both $A$ and $B$ are invertible.
To lower-bound the minimum eigenvalue, we need the following identity: $$B(A+B)^{-1}A=(A^{-1}+B^{-1})^{-1}$$ It directly leads to: $$\lambda_{\min} \left( B (A+B)^{-1} A \right) = \frac{1}{\lambda_{\max}(A^{-1}+B^{-1})}\geq \frac{1}{\lambda_{\max}(A^{-1})+\lambda_{\max}(B^{-1})}=\frac{1}{\frac{1}{\lambda_{\min}(A)}+\frac{1}{\lambda_{\min}(B)}}$$ where the inequality uses $\lambda_{\max}(X+Y)\leq\lambda_{\max}(X)+\lambda_{\max}(Y)$ for symmetric matrices.
To show that $B(A+B)^{-1}A=(A^{-1}+B^{-1})^{-1}$, we start from: $$A^{-1}(A+B)B^{-1}=A^{-1}AB^{-1}+A^{-1}BB^{-1}=B^{-1}+A^{-1}$$ Then taking the inverse of both sides gives the result.
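Both the identity and the resulting harmonic-mean bound are easy to verify numerically (random SPD instances; `random_spd` is a helper invented for this check):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd(n):
    """Helper: random symmetric positive definite matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

n = 6
A = random_spd(n)
B = random_spd(n)

# Identity: B(A+B)^{-1}A == (A^{-1}+B^{-1})^{-1}
lhs = B @ np.linalg.inv(A + B) @ A
rhs = np.linalg.inv(np.linalg.inv(A) + np.linalg.inv(B))
assert np.allclose(lhs, rhs)

# Tighter bound: harmonic mean of the minimal eigenvalues
lmA = np.linalg.eigvalsh(A)[0]
lmB = np.linalg.eigvalsh(B)[0]
bound = 1.0 / (1.0 / lmA + 1.0 / lmB)
lam_min = np.linalg.eigvalsh(rhs)[0]
assert lam_min >= bound - 1e-9
```

Note that this bound $\frac{\lambda_{\min}(A)\lambda_{\min}(B)}{\lambda_{\min}(A)+\lambda_{\min}(B)}$ does not involve $\lambda_{\max}$ at all, which is consistent with the simulation: inflating the top eigenvalues cannot push it down.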