Given several data sets, I need to pick exactly one value from each set so that the sum of the chosen values divided by the sum of their corresponding weights is minimal. The data sets are of arbitrary length (each data set may have a different length, but a data set and its weights always have the same length).
Consider a simple example:
DataSetOne = [500, 45]
WeightsOfSetOne = [10, 1]
DataSetTwo = [850, 90]
WeightsOfSetTwo = [10, 1]
Given these data sets, you can get 4 results:
[1] (500 + 850)/(10 + 10) = 1350/20 = 67.5
[2] (500 + 90)/(10 + 1) = 590/11 ≈ 53.6
[3] (45 + 850)/(1 + 10) = 895/11 ≈ 81.4
[4] (45 + 90)/(1 + 1) = 135/2 = 67.5
In this case, we see that [2] gives the optimal solution. However, this is a brute-force method, and I would rather calculate the answer mathematically (if possible), because brute force quickly becomes infeasible: the number of combinations grows exponentially with the number of data sets.
The approach I tried was as follows:
For each value (with its corresponding weight) in the first data set, start with that value and weight as the running totals. Then traverse the remaining data sets in order, and in each one choose the option that minimizes (total value so far)/(total weight so far).
However, this greedy approach fails on the data set below.
Data_set_one_values: [50, 72, 9]
Data_set_one_weights: [52, 32, 3]
Data_set_two_values: [87, 63, 0]
Data_set_two_weights: [43, 88, 36]
Data_set_three_values: [83, 16, 94]
Data_set_three_weights: [38, 22, 56]
Data_set_four_values: [22, 61, 37]
Data_set_four_weights: [25, 13, 55]
Data_set_five_values: [1, 15, 52]
Data_set_five_weights: [53, 80, 43]
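To make the failure concrete, here is a small Python sketch (the question is not tied to any language; Python is just an illustration) that runs the greedy heuristic above against an exhaustive search on this instance:

```python
from fractions import Fraction
from itertools import product

# The five (value, weight) data sets from the question.
sets = [
    list(zip([50, 72, 9],  [52, 32, 3])),
    list(zip([87, 63, 0],  [43, 88, 36])),
    list(zip([83, 16, 94], [38, 22, 56])),
    list(zip([22, 61, 37], [25, 13, 55])),
    list(zip([1, 15, 52],  [53, 80, 43])),
]

def greedy(sets):
    """The heuristic described above: try each starting item of the first
    set, then in every later set pick the item minimizing the running ratio
    (total value so far) / (total weight so far)."""
    best = None
    for v0, w0 in sets[0]:
        v, w = v0, w0
        for s in sets[1:]:
            v, w = min(((v + vi, w + wi) for vi, wi in s),
                       key=lambda t: Fraction(t[0], t[1]))
        ratio = Fraction(v, w)
        best = ratio if best is None or ratio < best else best
    return best

def exhaustive(sets):
    """Brute force over all combinations (3**5 = 243 here)."""
    return min(Fraction(sum(v for v, _ in combo), sum(w for _, w in combo))
               for combo in product(*sets))

print(greedy(sets), exhaustive(sets))   # 63/169 48/139
```

On this instance the greedy heuristic returns $63/169 \approx 0.373$, while the true minimum is $48/139 \approx 0.345$.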
If there is no exact way to calculate this other than brute-forcing all combinations, is there perhaps a way to get a good approximation?
One approach is to use mixed integer linear programming. For each set $i$ and item $j$, let $v_{i,j}$ denote the value and $w_{i,j}$ denote the weight, and let binary decision variable $x_{i,j}$ indicate whether the item is selected. The problem is to minimize $$\frac{\displaystyle{\sum_{i,j} v_{i,j} x_{i,j}}}{\displaystyle{\sum_{i,j} w_{i,j} x_{i,j}}}$$ subject to $$\sum_j x_{i,j} = 1$$ for all $i$.
You can linearize the fractional objective via a Charnes-Cooper transformation.
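Concretely, the transformation works as follows (a sketch; $t$, $y$, and the bound $U$ are auxiliary constructs introduced here). Let $t$ represent $1/\sum_{i,j} w_{i,j} x_{i,j}$ and $y_{i,j}$ represent $t\,x_{i,j}$. The problem becomes: minimize $$\sum_{i,j} v_{i,j}\, y_{i,j}$$ subject to $$\sum_{i,j} w_{i,j}\, y_{i,j} = 1, \qquad \sum_j x_{i,j} = 1 \text{ for all } i,$$ $$y_{i,j} \le t, \qquad y_{i,j} \le U x_{i,j}, \qquad y_{i,j} \ge t - U(1 - x_{i,j}), \qquad y_{i,j} \ge 0,$$ where $U$ is any valid upper bound on $t$, for example $U = 1/\sum_i \min_j w_{i,j}$ (assuming all weights are positive). The linking constraints force $y_{i,j} = t$ when $x_{i,j} = 1$ and $y_{i,j} = 0$ otherwise, so the objective equals the original ratio.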
Here is SAS code for your second instance:
The resulting optimal solution, with minimum ratio $48/139 \approx 0.345$, is: \begin{matrix}
i &j &x &v &w\\ \hline
1 &1 &0 &50 &52 \\
1 &2 &0 &72 &32 \\
1 &3 &1 &9 &3 \\
2 &1 &0 &87 &43 \\
2 &2 &0 &63 &88 \\
2 &3 &1 &0 &36 \\
3 &1 &0 &83 &38 \\
3 &2 &1 &16 &22 \\
3 &3 &0 &94 &56 \\
4 &1 &1 &22 &25 \\
4 &2 &0 &61 &13 \\
4 &3 &0 &37 &55 \\
5 &1 &1 &1 &53 \\
5 &2 &0 &15 &80 \\
5 &3 &0 &52 &43
\end{matrix}
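As a solver-free alternative (not part of the SAS approach above, and offered here only as a sketch), Dinkelbach's parametric algorithm also solves this fractional problem exactly: for a fixed ratio guess $\lambda$, minimizing $\sum_{i,j}(v_{i,j} - \lambda w_{i,j})x_{i,j}$ decomposes into an independent argmin per set, and iterating updates of $\lambda$ converges to the optimal ratio. In Python:

```python
from fractions import Fraction

def dinkelbach(sets):
    """Dinkelbach's algorithm for min (sum v)/(sum w), one pick per set.
    For a fixed ratio guess lam, the subproblem min sum(v - lam*w) splits
    into an independent argmin per set; iterate until its value is 0."""
    # Start lam from an arbitrary feasible pick (first item of each set).
    lam = Fraction(sum(s[0][0] for s in sets), sum(s[0][1] for s in sets))
    while True:
        # Per-set argmin of v - lam*w (the decomposed subproblem).
        pick = [min(s, key=lambda vw: vw[0] - lam * vw[1]) for s in sets]
        total_v = sum(v for v, _ in pick)
        total_w = sum(w for _, w in pick)
        if total_v - lam * total_w == 0:   # F(lam) = 0  =>  lam is optimal
            return lam, pick
        lam = Fraction(total_v, total_w)

sets = [
    list(zip([50, 72, 9],  [52, 32, 3])),
    list(zip([87, 63, 0],  [43, 88, 36])),
    list(zip([83, 16, 94], [38, 22, 56])),
    list(zip([22, 61, 37], [25, 13, 55])),
    list(zip([1, 15, 52],  [53, 80, 43])),
]
lam, pick = dinkelbach(sets)
print(lam, pick)
```

On the second instance this terminates after a few iterations and recovers the same ratio $48/139$ and the same selection as the table above; exact `Fraction` arithmetic makes the termination test $F(\lambda)=0$ reliable.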