EDIT:
Let $X,Y$ be random variables over some probability space with joint distribution $P$. The mutual information between $X$ and $Y$ is then defined as
$I(X;Y):=\sum\limits_{(x,y)\in\text{supp}(P)}P(x,y)\log\frac{P(x,y)}{\sum\limits_rP(r,y)\sum\limits_sP(x,s)}$.
The mutual information $I(X;Y)$ can thus be viewed as a function of the joint distribution $P$ alone. Hence we hereafter denote the mutual information associated with a joint distribution $P$ by $I(P)$.
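For concreteness, the definition of $I(P)$ can be computed directly from a joint distribution stored as a matrix. Here is a minimal sketch (the function name `mutual_information` is my own; the sum runs only over $\text{supp}(P)$, as in the definition):

```python
import numpy as np

def mutual_information(P):
    """I(P) for a joint distribution P given as a 2-D array.

    Sums P(x,y) * log( P(x,y) / (P_X(x) * P_Y(y)) ) over supp(P),
    where P_X, P_Y are the marginals of P.
    """
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)          # marginal of X: sum_s P(x, s)
    py = P.sum(axis=0)          # marginal of Y: sum_r P(r, y)
    mask = P > 0                # restrict the sum to supp(P)
    outer = np.outer(px, py)    # product of marginals, entrywise
    return float(np.sum(P[mask] * np.log(P[mask] / outer[mask])))
```

As a quick check, an independent joint distribution gives $I(P)=0$, and the "perfectly correlated" distribution $P=\operatorname{diag}(1/2,1/2)$ gives $I(P)=\log 2$.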
Now to the problem definition:
Let $P_1,P_2$ be two joint distributions over $\{1,2,\dots,N\}\times\{1,2,\dots,N\}$, so that $P_1,P_2 \in \mathbb{R}^{N \times N}$. Setting $Q= \frac{P_1+P_2}{2}$, which is again a joint distribution over the same set, I am trying to bound the mutual information $I(Q)$ using $I(P_1)$ and/or $I(P_2)$.
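One can explore this setup numerically on random joint distributions (a self-contained sketch; `mi` is just the definition of $I(P)$ restated, and the random matrices are illustrative, not from the question):

```python
import numpy as np

def mi(P):
    """I(P) for a joint distribution P given as a 2-D numpy array."""
    px, py = P.sum(axis=1), P.sum(axis=0)
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / np.outer(px, py)[mask])))

rng = np.random.default_rng(0)
N = 4
P1 = rng.random((N, N)); P1 /= P1.sum()   # random joint distribution on {1..N}^2
P2 = rng.random((N, N)); P2 /= P2.sum()
Q = (P1 + P2) / 2                          # the mixture is again a joint distribution

print(mi(P1), mi(P2), mi(Q))
```

On such examples $I(Q)$ generally differs from $\frac{1}{2}\bigl(I(P_1)+I(P_2)\bigr)$, which is why an explicit bound relating the three quantities is of interest.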
I have derived the bound $I(Q)\ge\sum \limits_{i,j}\frac{P_1(i,j)}{2}\log{\frac{P_1(i,j)}{\gamma_{ij}}}+\sum \limits_{i,j}\frac{P_2(i,j)}{2}\log{\frac{P_2(i,j)}{\gamma_{ij}}}$
with
$\gamma_{ij}:= \left( \sum \limits_s \frac{P_1(i,s)+P_2(i,s)}{2} \right )\left( \sum \limits_r \frac{P_1(r,j)+P_2(r,j)}{2} \right)$. With this in hand, I am interested in lower bounding $\frac{1}{\gamma_{ij}}$. After some calculation I arrived at a lower bound of the form
$\frac{1}{\gamma_{ij}}\ge \frac{1}{\sum \limits_sP_1(i,s)\sum \limits_rP_2(r,j)+\frac{1}{4}(\text{sum of three positive terms, each $\le$ 1})}.$
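The algebra behind this step can be verified numerically. Writing $A_k(i)=\sum_s P_k(i,s)$ and $B_k(j)=\sum_r P_k(r,j)$ (my shorthand, not from the question), $\gamma_{ij}=\frac14\bigl(A_1+A_2\bigr)(i)\bigl(B_1+B_2\bigr)(j)$ expands into four products, and since $\frac14 A_1B_2 \le A_1B_2$ we get $\gamma_{ij} \le A_1(i)B_2(j)+\frac14(\text{the other three terms})$, which is exactly the denominator above. A sketch of the check, on random illustrative distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
P1 = rng.random((N, N)); P1 /= P1.sum()
P2 = rng.random((N, N)); P2 /= P2.sum()

A1, A2 = P1.sum(axis=1), P2.sum(axis=1)   # A_k(i) = sum_s P_k(i, s)
B1, B2 = P1.sum(axis=0), P2.sum(axis=0)   # B_k(j) = sum_r P_k(r, j)

# gamma_ij = (1/2)(A1 + A2)(i) * (1/2)(B1 + B2)(j), as an N x N matrix
gamma = np.outer((A1 + A2) / 2, (B1 + B2) / 2)

# expansion into four products: gamma = (1/4)(A1B1 + A1B2 + A2B1 + A2B2)
expanded = 0.25 * (np.outer(A1, B1) + np.outer(A1, B2)
                   + np.outer(A2, B1) + np.outer(A2, B2))
assert np.allclose(gamma, expanded)

# since (1/4) A1 B2 <= A1 B2, gamma is at most the denominator of the bound:
upper = (np.outer(A1, B2)
         + 0.25 * (np.outer(A1, B1) + np.outer(A2, B1) + np.outer(A2, B2)))
assert np.all(gamma <= upper + 1e-15)
```

Entrywise, $1/\gamma_{ij}$ is then at least the reciprocal of that denominator, as stated.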
However, I would like to know whether there is any way to further lower bound the RHS of the above expression so that only the term $\frac{1}{\sum \limits_sP_1(i,s)\sum \limits_rP_2(r,j)}$ is left.
The motivation is to obtain a neat lower bound on $I(Q)$ involving the mutual informations $I(P_1)$ and $I(P_2)$.
Any help/idea will be highly appreciated.
Thanks