I would like to find settings for a clock generation circuit that can generate multiple clocks. The generated clock frequency $F_i$ is calculated as follows:
$F_i = F_{input} * \frac{M}{D * O_i}$
Here $M$ and $D$ are variables shared by all generated $F_i$, while $O_i$ is a separate variable for each $F_i$, and $F_{input}$ is a constant.
As we are talking about a digital circuit, the variables are limited to fairly small integer ranges (apart from $F_{input}$, $F_i$ and $F_{desired_i}$, which are real-valued).
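For concreteness, here is how the frequency formula could be evaluated in a short Python sketch. The input frequency, divider values, and ranges below are made up for illustration and are not taken from any specific device:

```python
# Hypothetical example values; real hardware constrains M, D, O_i
# to device-specific integer ranges.
F_INPUT = 100e6  # Hz, the constant reference clock

def f_out(m: int, d: int, o: int) -> float:
    """Generated frequency F_i = F_input * M / (D * O_i)."""
    return F_INPUT * m / (d * o)

# Shared M = 8, D = 1; per-output dividers O_i = 4 and 10:
print(f_out(8, 1, 4))   # 200 MHz
print(f_out(8, 1, 10))  # 80 MHz
```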
I now want to find $M$, $D$ and $O_i$ so that the relative errors $F_{error_i} = \frac{F_{desired_i} - F_i}{F_{desired_i}}$ are minimized across all outputs, subject to $F_i \le F_{desired_i}$ (so every error is non-negative).
I can formulate all of this as an integer linear program, except for the minimization objective. I would set the objective to
$\min \sum_i F_{error_i}^2$
which does not fit well into an ILP. How can I formulate the problem so that a computer can solve it? (Preferably not brute force.)
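To make the objective concrete, here is a small Python sketch of the function I want to minimize (the desired frequencies and parameter values are illustrative, and I use absolute rather than relative errors here just to keep the numbers simple):

```python
F_INPUT = 100e6  # Hz, illustrative constant reference clock

def objective(m, d, dividers, desired):
    """Sum of squared errors over all outputs.

    Returns None when any F_i exceeds its F_desired_i, i.e. the
    candidate (M, D, O_i) violates the F_i <= F_desired_i constraint.
    """
    total = 0.0
    for o, fd in zip(dividers, desired):
        f = F_INPUT * m / (d * o)  # F_i = F_input * M / (D * O_i)
        if f > fd:
            return None  # infeasible: F_i must not exceed F_desired_i
        total += (fd - f) ** 2
    return total

# Two outputs with desired frequencies 170 MHz and 80 MHz:
print(objective(8, 1, [5, 10], [170e6, 80e6]))
```

The difficulty is exactly this objective: the products $M \cdot O_i$ in the denominator and the squared error terms are both nonlinear, so the function above cannot be dropped into a plain ILP as-is.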