I am not familiar with optimization problems, but I want to understand a very simple formulation: \begin{array}{cc} {{\max_\mathbf{x}}} & \frac{\mathbf{a}^{T}\mathbf{x}}{\mathbf{b}^{T}\mathbf{x}}\\ {\rm s.t.} & x_{1}+x_{2}+\cdots+x_{n}\leq c \end{array} where $\mathbf{a}$ and $\mathbf{b}$ are $n\times1$ real vectors with positive elements, $c$ is a constant, and $x_i\geq0$.
I have found that this form is called linear-fractional programming. Is there a closed-form solution to the above problem, or does the solution have a special structure? Thanks.
The solution $\mathbf{x}^\star$ to your optimization problem is given by
$0 < x_j^\star \leq c $ for $j = \arg \max_i \frac{a_i}{b_i}$,
$x_i^\star = 0$ for $i\neq j$.
In words: put all the weight on the index with the largest ratio $a_i/b_i$. The optimizer is not unique because the objective is homogeneous of degree zero, so any positive $x_j^\star \leq c$ yields the same optimal value $a_j/b_j$.
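As a quick numerical sanity check (not a proof), one can sample random feasible points and verify that none of them beats the claimed optimum $a_j/b_j$. This is a minimal sketch with arbitrary example data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 5, 3.0
a = rng.uniform(0.1, 2.0, n)   # positive entries, as in the problem statement
b = rng.uniform(0.1, 2.0, n)

# Claimed optimum: all weight on the index maximizing a_i / b_i.
j = np.argmax(a / b)
best = a[j] / b[j]

# Sample random feasible points x >= 0 with sum(x) <= c and evaluate the ratio.
for _ in range(10_000):
    x = rng.uniform(0.0, 1.0, n)
    x *= c * rng.uniform() / x.sum()        # rescale so that sum(x) <= c
    assert a @ x / (b @ x) <= best + 1e-9   # no sample should exceed a_j / b_j

print("no sampled point beats a_j/b_j =", best)
```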
How can we prove this? Following http://en.wikipedia.org/wiki/Linear-fractional_programming#Duality, write the constraints as $[-\mathbf{I};\mathbf{1}^T]\mathbf{x} \leq [\mathbf{0};c]$ (i.e. $\mathbf{x}\geq\mathbf{0}$ and $\mathbf{1}^T\mathbf{x}\leq c$). The dual of your fractional problem is then \begin{equation} \begin{aligned} \min~ & \lambda \\ \text{s.t.}~ & [-\mathbf{I},\mathbf{1}]\,\mathbf{u} + \lambda \mathbf{b} = \mathbf{a}, \\ & -[\mathbf{0}^T,c]\,\mathbf{u} \geq 0, \\ & \mathbf{u} \in \mathbb{R}^{n+1}_+,\ \lambda \in \mathbb{R}. \end{aligned} \end{equation} Here $\mathbf{I}$ is the $n\times n$ identity matrix, $\mathbf{1}$ is the $n \times 1$ vector of ones, and $\mathbf{0}$ is the $n \times 1$ vector of zeros. Since $c>0$ and $\mathbf{u}\geq\mathbf{0}$, the second constraint forces $u_{n+1}=0$; the equality constraints then read $\lambda b_i = a_i + u_i \geq a_i$ for all $i$, so $\lambda \geq \max_i \frac{a_i}{b_i}$. The minimum $\lambda^\star = \max_i \frac{a_i}{b_i}$ is attained with $u_j = 0$, and by duality this is the optimal value of the primal problem, achieved by the $\mathbf{x}^\star$ above.
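To double-check the dual numerically, one can solve it as an ordinary LP. The sketch below (my own addition, using `scipy.optimize.linprog` with made-up example data) stacks the variables as $z = (\mathbf{u}, \lambda)$ and confirms that the dual optimum matches $\max_i a_i/b_i$:

```python
import numpy as np
from scipy.optimize import linprog

n, c = 4, 2.0
a = np.array([1.0, 3.0, 2.0, 0.5])   # example data, positive entries
b = np.array([2.0, 1.5, 4.0, 1.0])

# Variables z = (u_1, ..., u_{n+1}, lambda); minimize lambda.
cost = np.r_[np.zeros(n + 1), 1.0]

# Equality constraints: -u_{1:n} + u_{n+1} * 1 + lambda * b = a
A_eq = np.hstack([-np.eye(n), np.ones((n, 1)), b.reshape(-1, 1)])
b_eq = a

# Inequality -[0^T, c] u >= 0, rewritten for linprog as c * u_{n+1} <= 0.
A_ub = np.r_[np.zeros(n), c, 0.0].reshape(1, -1)
b_ub = np.zeros(1)

bounds = [(0, None)] * (n + 1) + [(None, None)]   # u >= 0, lambda free
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

# Both should equal max_i a_i / b_i = 2.0 (up to solver tolerance).
print(res.fun, max(a / b))
```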