As a constraint in an optimization program I have a sum of logarithms, $$\sum_{i=1}^{K}\log(1+c_ix_i)\le e,$$ where $\mathbf{c} = (c_1,c_2,\cdots,c_K)$ is a constant positive vector and $\mathbf{x}=(x_1,x_2,\cdots,x_K)$ is the vector of (positive) variables. The elements of $\mathbf{c}$ and $\mathbf{x}$ are arranged in descending order, and $e$ is a constant (not Euler's number).
Since this constraint doesn't describe a convex set, the program is nonconvex, so I am looking for a way to relax this sum into an affine function. For example, if $e=0.001\ll1$, it is safe to use a first-order Taylor approximation and replace the constraint with $$\sum_{i=1}^{K}\log(1+c_ix_i) \approx \sum_{i=1}^{K}{c_ix_i}\le e ~.$$
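(As a side note, assuming the logarithm is natural, this linearized constraint is in fact conservative rather than a relaxation: since $\log(1+t)\le t$ for $t\ge 0$,
$$\sum_{i=1}^{K}c_ix_i\le e \quad\Longrightarrow\quad \sum_{i=1}^{K}\log(1+c_ix_i)\le\sum_{i=1}^{K}c_ix_i\le e,$$
so any point satisfying the affine constraint also satisfies the original one.)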
However, my problem requires $e\approx 1$. In this case the largest term certainly dominates all the others, so one might keep only the affine constraint $c_1x_1 \le 2^e -1$ on the leading term; but satisfying this inequality does not imply that the original (sum-of-logarithms) constraint is satisfied: the full sum can in fact exceed $e$, because there is no upper bound on the remaining $x_i,~ i =2,\cdots,K$.
How should I relax my problem into a convex one? I think the last resort is a piecewise-linear approximation of each logarithm term in the sum. Any comment or help is much appreciated.
By changing variables $x_i=e^{y_i}$, the constraint becomes convex: $\sum_i \log(1+c_i e^{y_i})$ is a convex function of $\mathbf{y}$. This is the idea behind geometric programming.
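For instance, here is a minimal sketch in CVXPY (assuming a natural logarithm; the data `c`, `e_max`, and the objective are placeholders, not taken from your problem). It uses the identity $\log(1+c_i e^{y_i}) = \log\!\big(1+e^{y_i+\log c_i}\big)$, which is exactly CVXPY's convex `logistic` atom:

```python
import numpy as np
import cvxpy as cp

# Illustrative placeholder data (not from the original problem).
K = 4
c = np.array([4.0, 3.0, 2.0, 1.0])   # positive constants, in descending order
e_max = 1.0                          # the constant "e" on the right-hand side

# Change of variables: y_i = log(x_i), so x_i = exp(y_i) is automatically positive.
y = cp.Variable(K)

# log(1 + c_i*exp(y_i)) = log(1 + exp(y_i + log(c_i))) = logistic(y_i + log(c_i)),
# which is convex in y, so the summed constraint below is a valid DCP constraint.
constraints = [cp.sum(cp.logistic(y + np.log(c))) <= e_max]

# Placeholder objective: maximize sum(y) = log(prod_i x_i); any objective that is
# concave in y (for maximization) keeps the whole problem convex.
prob = cp.Problem(cp.Maximize(cp.sum(y)), constraints)
prob.solve()

x = np.exp(y.value)                  # recover the original variables
print(prob.status, x)
```

A side benefit of the substitution is that positivity of $\mathbf{x}$ is enforced automatically, since $e^{y_i}>0$ for every real $y_i$.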
Unfortunately, whether or not this transformation actually helps depends on the nature of your objective and the other constraints. After making this change of variables, the transformed objective function may or may not be convex. For example, if $f(x)=-x$, then $g(y)=-e^y$ is not convex.
To be more precise, if $f(x_1,\ldots,x_K)$ is a twice-differentiable function and $g(y_1,\ldots,y_K)=f(e^{y_1},\ldots,e^{y_K})$, then $g$ is convex if and only if, for all $\mathbf{y}$,
$$ \nabla^2 g(\mathbf{y}) = \operatorname{diag}(e^\mathbf{y})\, \nabla^2 f(e^\mathbf{y})\, \operatorname{diag}(e^\mathbf{y}) + \operatorname{diag}\!\big( \nabla f(e^\mathbf{y}) \circ e^\mathbf{y}\big) \succeq 0, $$ where $\circ$ denotes the elementwise product.
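Here the Hessian follows from the entrywise chain rule:
$$ \frac{\partial g}{\partial y_i} = \frac{\partial f}{\partial x_i}(e^{\mathbf{y}})\, e^{y_i}, \qquad \frac{\partial^2 g}{\partial y_i \partial y_j} = \frac{\partial^2 f}{\partial x_i \partial x_j}(e^{\mathbf{y}})\, e^{y_i} e^{y_j} + \delta_{ij}\, \frac{\partial f}{\partial x_i}(e^{\mathbf{y}})\, e^{y_i}. $$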
Since $\operatorname{diag}(e^\mathbf{y})$ is positive definite, multiplying on both sides by $\operatorname{diag}(e^{-\mathbf{y}})$ and substituting $\mathbf{x}=e^{\mathbf{y}}$ shows that this is equivalent to requiring, for all $\mathbf{x}>0$,
$$ \nabla^2 f (\mathbf{x}) + \operatorname{diag}\!\big( \nabla f(\mathbf{x}) / \mathbf{x} \big) \succeq 0, $$
where the division is elementwise.
So a simple sufficient condition follows: $g$ is convex whenever $f$ is convex and the gradient of $f$ is nonnegative for all positive $\mathbf{x}$ (both terms above are then positive semidefinite).
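As a quick numerical sanity check of this condition (a sketch only; the helper `convexity_certificate` and the test data are hypothetical), one can evaluate the smallest eigenvalue of $\nabla^2 f(\mathbf{x}) + \operatorname{diag}(\nabla f(\mathbf{x})/\mathbf{x})$ at a few random positive points:

```python
import numpy as np

def convexity_certificate(hess_f, grad_f, x):
    """Smallest eigenvalue of  hess f(x) + diag(grad f(x) / x).
    If this is nonnegative for *every* x > 0, then g(y) = f(exp(y)) is convex;
    sampling a few points is only a quick check, not a proof."""
    M = hess_f(x) + np.diag(grad_f(x) / x)
    return np.linalg.eigvalsh(M).min()

K = 4
c = np.array([4.0, 3.0, 2.0, 1.0])

# Example 1: f(x) = c.x is linear (convex) with nonnegative gradient -> condition holds.
grad_lin = lambda x: c
hess_lin = lambda x: np.zeros((K, K))

# Example 2: f(x) = -sum(x) has a negative gradient -> condition fails,
# matching the earlier observation that g(y) = -e^y is not convex.
grad_neg = lambda x: -np.ones(K)
hess_neg = lambda x: np.zeros((K, K))

rng = np.random.default_rng(0)
for _ in range(3):
    x = rng.uniform(0.1, 5.0, size=K)   # random positive test points
    print(convexity_certificate(hess_lin, grad_lin, x) >= 0,   # True
          convexity_certificate(hess_neg, grad_neg, x) >= 0)   # False
```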