Finding the Expansion of a Separable Convex Optimization Problem


Hi, there is a convex optimization problem in this paper which I am trying to implement in MOSEK. The author specifies that they also implement it using a separable optimization method.

Specifically, the variable to optimize over is a vector $\pi \in \mathbb{R}^n$ of $n$ reals, where $n$ is the cardinality of a constructed set $S$.

The maximization problem is defined in section 3.4 on page 4 of the paper. Here is the problem restated from the paper:

$tf^{(new)}$ is a vector of integers of length $m$.

$\theta$ is a matrix of reals of size $m$ by $|S|$; $\theta_j^{(i)}$ is the element in row $j$, column $i$.


$$\max_{\pi \in \mathbb{R}^{|S|}} \sum_{j = 1}^{m} tf^{(new)}_j \log\left(\sum_{i \in S} \pi_i\,\theta_j^{(i)}\right)$$

subject to:

1) $\sum_{i \in S}\pi_i = 1$

2) $\forall i \in S : \pi_i \geq 0$
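Before worrying about the separable form, the problem as stated can be solved directly with a generic constrained solver. Here is a minimal sketch using SciPy's SLSQP (not MOSEK); the sizes, `theta`, and `tf_new` are small synthetic stand-ins, not values from the paper:

```python
# Sketch of the maximization with a generic solver; synthetic data only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n = 4, 3                                   # m terms, n = |S| variables
theta = rng.uniform(0.1, 1.0, size=(m, n))    # theta[j, i] = theta_j^{(i)} > 0
tf_new = np.array([5.0, 2.0, 3.0, 1.0])       # stand-in for tf^{(new)}

def neg_objective(pi):
    # SciPy minimizes, so negate; theta @ pi gives all inner sums at once
    return -np.sum(tf_new * np.log(theta @ pi))

cons = ({"type": "eq", "fun": lambda pi: np.sum(pi) - 1.0},)  # constraint 1
bounds = [(0.0, None)] * n                                    # constraint 2
pi0 = np.full(n, 1.0 / n)                                     # feasible start

res = minimize(neg_objective, pi0, bounds=bounds, constraints=cons)
print(res.x, res.x.sum())
```

Note that the objective is only defined where every $\pi \cdot \theta_j > 0$, which holds on the interior of the simplex when all entries of $\theta$ are positive, as in this synthetic instance.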


I've managed to rewrite the objective as

$$tf^{(new)}_1 \log(\pi \cdot \theta_1) + \dots + tf^{(new)}_{m} \log(\pi \cdot \theta_{m})$$

where $\theta_j$ denotes row $j$ of $\theta$.

I would like to decompose this problem so that each term uses only one element of $\pi$, which would let me use this separable convex solver. But as you can see, this is not the case in the expression above, because each term still carries the complete $\pi$ vector. How can this optimization problem be decomposed into separable terms?
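To see concretely that each term depends on the whole vector, one can look at the gradient of a single term: $\nabla_\pi \, tf^{(new)}_j \log(\pi \cdot \theta_j) = tf^{(new)}_j \, \theta_j / (\pi \cdot \theta_j)$, which is nonzero in every coordinate whenever $\theta_j$ has no zero entries. A quick check with synthetic numbers (illustrative only):

```python
# The gradient of one term tf_j * log(pi . theta_j) is
# tf_j * theta_j / (pi . theta_j): nonzero in every coordinate
# when theta_j is strictly positive, so the term is not separable.
import numpy as np

rng = np.random.default_rng(1)
n = 4
theta_j = rng.uniform(0.1, 1.0, size=n)   # one row of theta, all positive
tf_j = 2.0
pi = np.full(n, 1.0 / n)

grad = tf_j * theta_j / (theta_j @ pi)    # d/d(pi_i) of the j-th term
print(grad)                               # every entry is nonzero
```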


BEST ANSWER

Assuming the $\theta_j$ (the rows of $\theta$) are linearly independent, which requires $m \leq n$, you can change to a new basis of ${\mathbb R}^n$ so that $\pi \cdot \theta_1, \ldots, \pi \cdot \theta_m$ are the first $m$ coordinates of $\pi$ with respect to the new basis.
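This change of basis can be sketched numerically: stack the $m$ rows $\theta_j$, complete them to a full basis of $\mathbb{R}^n$ with vectors from their orthogonal complement (here obtained via SVD), and check that the first $m$ coordinates of $\pi$ in the new basis are exactly the dot products $\pi \cdot \theta_j$. The data below is synthetic and only illustrates the construction:

```python
# Sketch of the basis change from the answer, with synthetic theta.
import numpy as np

rng = np.random.default_rng(2)
m, n = 2, 4
theta = rng.uniform(0.1, 1.0, size=(m, n))    # assumed linearly independent rows

# rows of vt beyond the first m span the null space of theta,
# i.e. the orthogonal complement of the theta_j
_, _, vt = np.linalg.svd(theta)
complement = vt[m:]                           # (n - m) completing vectors
B = np.vstack([theta, complement])            # invertible n x n matrix

pi = rng.dirichlet(np.ones(n))                # an arbitrary point on the simplex
y = B @ pi                                    # coordinates of pi in the new basis
print(np.allclose(y[:m], theta @ pi))         # first m coordinates = pi . theta_j
```

In the new coordinates $y = B\pi$ the $m$ logarithmic terms each touch only one coordinate $y_j$, though the simplex constraints on $\pi$ become general linear constraints on $y$.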