I have linear functions $f_1, f_2, g_1, g_2$ (each is a finite sum). Can we linearize $\frac{f_1 g_1 + f_2 g_2}{f_1 + f_2}$ using some sort of factorization? By linearize, I mean rewriting the expression in linear terms and getting rid of the denominator.
Edit:
To remove ambiguity, I will explicitly state $f_1, f_2, g_1, g_2$.
I am trying to solve an integer programming problem over the set of all trees with $n$ leaves.
Fix two trees $T_0, T_1$ with $n$ leaves. For each $s \in \{0,1\}$, fix $L_{T_s}$ and $M_{T_s}$ as disjoint subsets of the nodes of $T_s$ (we call them leaf ($L$) and master ($M$) nodes). Finally, define a function $P$ that assigns a different value to each node. Then I have:
$f_1 = \sum_{a \in L_{T_1}}P(a)$
$f_2 = \sum_{a \in M_{T_0}}P(a)$
$g_1 = \sum_{d=0}^{depth(T_0)-1}w_d$
$g_2 = \sum_{d=0}^{depth(T_1)-1}w_d$
Then I want to optimize the function $\frac{f_1 g_1 + f_2 g_2}{f_1 + f_2}$ over all possible trees. However, I would like to express this as a linear function rather than a fraction, so that I can solve the problem with standard integer programming techniques.
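For intuition, note that the objective is a convex combination of $g_1$ and $g_2$: writing $\lambda = \frac{f_1}{f_1 + f_2}$, we have $\frac{f_1 g_1 + f_2 g_2}{f_1 + f_2} = \lambda g_1 + (1-\lambda) g_2$. A minimal numeric sketch (the values of $f_1, f_2, g_1, g_2$ below are made up, not taken from any particular pair of trees):

```python
# Hypothetical example values for the four linear sums.
f1, f2 = 3, 5   # e.g. sums of P over the chosen node sets
g1, g2 = 2, 4   # e.g. sums of the depth weights w_d

# The fractional objective from the question.
objective = (f1 * g1 + f2 * g2) / (f1 + f2)

# The same value written as a convex combination of g1 and g2.
lam = f1 / (f1 + f2)
weighted = lam * g1 + (1 - lam) * g2

print(objective)              # 3.25
print(objective == weighted)  # True
```

This view makes the difficulty explicit: the weight $\lambda$ itself depends on the decision variables, which is why the objective is not linear as stated.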