I am trying to set up an integer program that searches over all pairs of trees $T_0, T_1$ with $n$ leaves each and minimizes a certain cost function $f(T_0, T_1)$.
I know the standard way of modelling trees in integer programming is to define binary indicator variables $v_{d,t}$ that equal one if there is a leaf at depth $d$ and node $t$ corresponds to it. In my case, however, things are harder.
My function $f(T_0, T_1)$ is nonlinear, so I am not sure how the integer program would work. I describe the problem below in the hope that someone can point me in the right direction.
For any tree $T_s$, $s \in \{0,1\}$, with $n$ leaves, I designate one subset of its nodes as leaves $L_{T_s}$ and another, disjoint subset as master nodes $M_{T_s}$. Each node $a \in L_{T_s} \cup M_{T_s}$ carries a weight $P(a)$. In addition, for each depth $d$ in the tree $T_s$, I define the weighted sum $w_d^{(s)}$ as the sum of the weights of all nodes at depth $\ge d$ in $T_s$.
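To make the definition of $w_d$ concrete, here is a minimal sketch, assuming a tree is given simply as a list of (depth, weight) pairs for its leaf and master nodes (a hypothetical representation; my actual trees may be stored differently):

```python
def weighted_sums(nodes, max_depth):
    """Return w_d for d = 0..max_depth, where w_d is the sum of the
    weights P(a) of all nodes a at depth >= d."""
    return [sum(p for depth, p in nodes if depth >= d)
            for d in range(max_depth + 1)]

# Example: three nodes at depths 1, 2, 2 with weights 5, 3, 2.
nodes = [(1, 5), (2, 2), (2, 3)]
print(weighted_sums(nodes, 2))  # -> [10, 10, 5]
```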
My objective function $f(T_0, T_1)$ is given by $$f(T_0, T_1)=\frac{\sum_{a\in L_{T_1}}P(a)\sum_{d=0}^{\operatorname{depth}(T_0)-1}w_d^{(0)} + \sum_{a\in M_{T_1}}P(a)\sum_{d=0}^{\operatorname{depth}(T_1)-1}w_d^{(1)}}{ \sum_{a\in L_{T_1}}P(a)+\sum_{a\in M_{T_1}}P(a) }$$
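For concreteness, here is a toy evaluation of $f(T_0, T_1)$, again assuming each tree is given as lists of (depth, weight) pairs for its leaf and master nodes (a hypothetical encoding, just to make the formula explicit; the $w_d$ of each term is taken from the tree whose depth bounds the inner sum):

```python
def w(nodes, d):
    # w_d for one tree: total weight of its nodes at depth >= d
    return sum(p for depth, p in nodes if depth >= d)

def f(T0, T1):
    all0 = T0['leaves'] + T0['masters']
    all1 = T1['leaves'] + T1['masters']
    depth0 = max(d for d, _ in all0)
    depth1 = max(d for d, _ in all1)
    PL1 = sum(p for _, p in T1['leaves'])   # sum of P(a), a in L_{T_1}
    PM1 = sum(p for _, p in T1['masters'])  # sum of P(a), a in M_{T_1}
    num = (PL1 * sum(w(all0, d) for d in range(depth0))
           + PM1 * sum(w(all1, d) for d in range(depth1)))
    return num / (PL1 + PM1)

T0 = {'leaves': [(1, 2), (2, 1)], 'masters': [(1, 3)]}
T1 = {'leaves': [(1, 4)], 'masters': [(2, 2)]}
print(f(T0, T1))  # -> 12.0
```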
However, I am having trouble with the denominator, since it makes the objective nonlinear. Is there any way to linearize this integer program? (I know how I would solve the problem if the denominator were not there.)