Basic questions about optimizing a concave function with constraints

Consider the following problem:

\begin{align} \text{maximize} \quad M(\mathbf y) & = \log \Big(\prod_i U_i(y_i) \Big) \\ \text{subject to} \quad y_i & = \sum_{j=1}^{m} \frac{x_{ij}}{a_{ij}} \\ \sum_{i=1}^{n} \frac{c_i x_{ij}}{a_{ij}} & \le A_j \qquad \text{for } j \in \{1,\dots,m\} \\ \sum_{i=1}^n \sum_{j=1}^m x_{ij} & \le B \end{align}

I want to see whether this is a tractable problem. The tricky point is that the objective function is written in terms of $\mathbf y$, while the constraints are in terms of the $x_{ij}$'s.

  • I can show that $M(\cdot)$ is a concave function of $\mathbf y$ without taking the $x_{ij}$'s into account. Is that sufficient?
  • Is it true that $\mathbf y$ satisfies convex constraints? I think this follows immediately from the fact that the constraints on the $x_{ij}$'s are convex and each $y_i$ is a linear combination of the $x_{ij}$'s. Correct?
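Spelling the second bullet out: every constraint is affine in the joint variables $(\mathbf x, \mathbf y)$, so the joint feasible set

$$\mathcal F = \Big\{ (\mathbf x, \mathbf y) \;:\; y_i = \sum_{j=1}^{m} \frac{x_{ij}}{a_{ij}}, \quad \sum_{i=1}^{n} \frac{c_i x_{ij}}{a_{ij}} \le A_j, \quad \sum_{i=1}^{n} \sum_{j=1}^{m} x_{ij} \le B \Big\}$$

is a polyhedron, and the set of feasible $\mathbf y$'s is its image under the linear projection $(\mathbf x, \mathbf y) \mapsto \mathbf y$, hence also convex.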
There is 1 answer below.

Given that your optimization variables are $x$ and $y$ and the rest are constants, and given that you can show that your objective $M$ is concave, your problem is indeed a convex optimization problem and should, at least in theory, be tractable.
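As a quick numerical sanity check (not a proof) of the concavity of $M$, one can spot-check midpoint concavity for a hypothetical concave, positive utility, say $U_i(t) = 1 + t$, which gives $M(\mathbf y) = \sum_i \log(1 + y_i)$:

```python
# Midpoint-concavity spot check for M(y) = sum_i log(U_i(y_i)) with the
# hypothetical choice U_i(t) = 1 + t (concave and positive for y >= 0).
import numpy as np

rng = np.random.default_rng(0)

def M(y):
    return np.sum(np.log(1.0 + y))      # assumed U_i(t) = 1 + t

for _ in range(1000):
    y1, y2 = rng.uniform(0.0, 10.0, size=(2, 5))
    # Concavity: M at the midpoint dominates the midpoint of the M values.
    assert M((y1 + y2) / 2.0) >= (M(y1) + M(y2)) / 2.0 - 1e-12
print("midpoint concavity held on all sampled pairs")
```

Passing such a check is of course only evidence; here concavity also follows analytically because each $\log(U_i(\cdot))$ is the composition of the concave increasing $\log$ with a concave positive function.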

Why 'should be'? Because tractability in practice depends on what you know about the functions $U_i$. The following cases are tractable (of course, there might be others):

  • $M(\cdot)$ is continuously differentiable, and you can efficiently compute its gradient
  • You can efficiently compute a subgradient of $-M(\cdot)$
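When the gradient is available (the first bullet), a general-purpose solver handles small instances directly. Below is a minimal sketch using SciPy's SLSQP on made-up data, under the added assumptions that $U_i(t) = t$ (so $M(\mathbf y) = \sum_i \log y_i$) and $x_{ij} \ge 0$ (so every $y_i$ stays positive); none of these choices come from the original problem statement.

```python
# Sketch: solve a small instance with SciPy's SLSQP.
# Assumed data (hypothetical): U_i(t) = t, so M(y) = sum_i log(y_i);
# x_ij >= 0 is added so that every y_i remains strictly positive.
import numpy as np
from scipy.optimize import minimize

n, m = 2, 2
a = np.array([[1.0, 2.0],
              [2.0, 1.0]])              # a_ij > 0 (made up)
c = np.array([1.0, 1.0])                # c_i
A = np.array([4.0, 4.0])                # per-j budgets A_j
B = 6.0                                 # total budget

def y_of(x):
    # y_i = sum_j x_ij / a_ij  -- linear in x
    return (x.reshape(n, m) / a).sum(axis=1)

def neg_M(x):
    # minimize -M(y)  <=>  maximize M(y) = sum_i log(y_i)
    return -np.sum(np.log(y_of(x)))

cons = [
    # sum_i c_i x_ij / a_ij <= A_j, one inequality per column j
    {"type": "ineq",
     "fun": lambda x, j=j: A[j] - (c * x.reshape(n, m)[:, j] / a[:, j]).sum()}
    for j in range(m)
]
cons.append({"type": "ineq", "fun": lambda x: B - x.sum()})  # total budget

x0 = np.full(n * m, 0.5)                # strictly feasible start, y(x0) > 0
res = minimize(neg_M, x0, method="SLSQP",
               bounds=[(0.0, None)] * (n * m), constraints=cons)
print(res.success, -res.fun)
```

For this particular data one can check by hand that the optimum spends the whole budget on the cheapest slots, $x_{11} = x_{22} = 3$, giving $M = 2\log 3 \approx 2.197$; the solver recovers this value.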