I am interested in first-order necessary conditions for the following minimization problem, where the function $f$ is continuous, nondecreasing, and concave with $f(0)=0$, but not necessarily differentiable:
$\min \frac{\sum_{i=1}^n f(y_i)}{\sum_{i=1}^n f(x_i)}$
subject to:
$y_i = \sum_{j=i}^n x_j$
$0 \le x_i \le L$ for all $1 \le i \le n$.
Note that if we know that the minimum value of the objective is $F$, then $x$ minimizes the ratio if and only if it attains the optimal value $0$ in the parametric problem
$\min \sum_{i=1}^n f(y_i) - F \sum_{i=1}^n f(x_i)$
over the same constraints, if that reformulation is helpful.
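As a sanity check, this equivalence can be illustrated numerically. The sketch below is a brute-force check under assumptions not in the question: $f(t)=\sqrt{t}$, $n=2$, $L=1$, and $x$ restricted to a coarse grid with $x_i > 0$ so the denominator is positive.

```python
import itertools
import math

# Hypothetical instance: f(t) = sqrt(t) is continuous, nondecreasing,
# concave, and satisfies f(0) = 0, matching the stated assumptions.
def f(t):
    return math.sqrt(t)

n, L = 2, 1.0
grid = [k / 10 for k in range(1, 11)]  # x_i in (0, L]; avoids a zero denominator

def suffix_sums(x):
    # y_i = sum_{j >= i} x_j, as in the constraint
    return [sum(x[i:]) for i in range(n)]

def ratio(x):
    y = suffix_sums(x)
    return sum(f(yi) for yi in y) / sum(f(xi) for xi in x)

points = list(itertools.product(grid, repeat=n))
F = min(ratio(x) for x in points)  # minimum ratio over the grid

def parametric(x, F):
    # Numerator minus F times the denominator
    y = suffix_sums(x)
    return sum(f(yi) for yi in y) - F * sum(f(xi) for xi in x)

# Any x with ratio(x) >= F gives parametric(x, F) >= 0, with equality
# exactly at the ratio minimizers, so the parametric minimum is 0.
min_param = min(parametric(x, F) for x in points)
print(F, min_param)  # min_param is ~0 up to floating-point error
```

This is the standard parametric trick for fractional objectives: the ratio problem and the difference problem share minimizers exactly when the parameter equals the optimal ratio $F$.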
The difficulty is that the objective is neither differentiable nor convex, but it seems that the obvious subgradient version of the KKT conditions should hold: do they?