Minimize the weighted sum of reciprocals


Let $\mathbf{a}_{i} \in \mathbb{R}^{M}$ with $\|\mathbf{a}_{i}\|^{2} = 1$, $\forall i = 1, \ldots, N$.

I need to solve the following problem in closed form:

\begin{align} \displaystyle \mathrm{argmin}_{\mathbf{X}} & \; \sum_{i=1}^{N} \frac{c_{i}}{\mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i}} \\ \displaystyle \mathrm{s.t.} & \; \mathbf{X} \succeq \mathbf{0}, \\ & \; \mathrm{tr}(\mathbf{X}) \leq 1 \end{align}

with $c_{i} > 0$, $\forall i = 1, \ldots, N$.

Tentative solution (1). The problem is convex: each term $c_{i}/(\mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i})$ is the composition of the convex, decreasing function $t \mapsto c_{i}/t$ (for $t > 0$) with the linear function $\mathbf{X} \mapsto \mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i}$, and the constraint set is convex. So I tried writing the KKT conditions

\begin{align} \mathrm{tr}(\mathbf{X}) \leq 1, \; \mathbf{X} & \succeq \mathbf{0}, \\ \mu \geq 0, \; \boldsymbol{\Psi} & \succeq \mathbf{0}, \\ -\sum_{i=1}^{N} \frac{c_{i} \mathbf{a}_{i} \mathbf{a}_{i}^{\mathrm{H}}}{(\mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i})^{2}} + \mu \mathbf{I} - \boldsymbol{\Psi} & = \mathbf{0}, \\ \mu(\mathrm{tr}(\mathbf{X}) - 1) = 0, \; \boldsymbol{\Psi} \mathbf{X} & = \mathbf{0} \end{align}

but they don't seem to provide a closed-form solution.

Tentative solution (2). I tried to change the optimization variable from $\mathbf{X}$ to $x_{1}, \ldots, x_{N}$ with $x_{i} = \mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i}$, but I don't know how to translate the constraints. Certainly $0 \leq x_{i} \leq 1$, $\forall i = 1, \ldots, N$ (from $\mathbf{X} \succeq \mathbf{0}$ and $\mathbf{a}_{i}^{\mathrm{H}} \mathbf{X} \mathbf{a}_{i} \leq \mathrm{tr}(\mathbf{X}) \|\mathbf{a}_{i}\|^{2} \leq 1$), but what are the remaining constraints coupling the $x_{i}$?
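For intuition on this substitution: in the special case where the $\mathbf{a}_i$ are orthonormal (so $N = M$), the change of variables is exact, since $\mathbf{X} = \sum_i x_i \mathbf{a}_i \mathbf{a}_i^{\mathrm{H}}$ realizes any targets with $x_i \geq 0$, $\sum_i x_i \leq 1$, and the problem reduces to $\min \sum_i c_i/x_i$ subject to $\sum_i x_i \leq 1$, solved via Cauchy–Schwarz by $x_i = \sqrt{c_i}/\sum_j \sqrt{c_j}$ with optimal value $(\sum_j \sqrt{c_j})^2$. A minimal pure-Python check of this special case (the weights $c$ below are my own example, not from the question):

```python
import math
import random

def objective(c, x):
    """Weighted sum of reciprocals: sum_i c_i / x_i."""
    return sum(ci / xi for ci, xi in zip(c, x))

c = [1.0, 4.0, 9.0]  # hypothetical example weights c_i > 0

# Closed form when the a_i are orthonormal: x_i = sqrt(c_i) / sum_j sqrt(c_j)
s = sum(math.sqrt(ci) for ci in c)
x_star = [math.sqrt(ci) / s for ci in c]
f_star = objective(c, x_star)

# Cauchy-Schwarz predicts the optimal value (sum_j sqrt(c_j))^2
assert abs(f_star - s * s) < 1e-9

# No random feasible point on the simplex should do better
random.seed(0)
for _ in range(1000):
    raw = [random.random() + 1e-9 for _ in c]
    t = sum(raw)
    x = [r / t for r in raw]  # feasible: x_i >= 0, sum_i x_i = 1
    assert objective(c, x) >= f_star - 1e-9
```

The open question is which extra constraints on the $x_i$ appear when the $\mathbf{a}_i$ are not orthonormal.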


There are 2 answers below.

Answer 1.

Along the lines of attempt (2), you can formulate the problem in terms of the eigenpairs $(\lambda_j,v_j)$ of $X$, where the $v_j$ are orthonormal ($\|v_j\|=1$). Write $a_i=\sum_{j=1}^n w_{ij} v_j$. Then the objective becomes $$ \sum_{i=1}^n\frac{c_i}{\sum_{j=1}^nw_{ij}^2\lambda_j}. $$ The constraints $X\succeq0$ and $\mathrm{tr}(X)\leq1$ become $\lambda_j\geq0$ and $\sum_{j=1}^n\lambda_j\leq1$. Taking the squared norm of $a_i=\sum_{j=1}^n w_{ij} v_j$ gives $\sum_{j=1}^n w_{ij}^2=1$.
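This reformulation is easy to sanity-check numerically; here is a minimal pure-Python sketch (the eigenbasis, eigenvalues, and test vector are arbitrary choices of mine):

```python
import math

# Orthonormal eigenbasis of X: rotation by an arbitrary angle
th = math.pi / 6
v1 = (math.cos(th), math.sin(th))
v2 = (-math.sin(th), math.cos(th))
lam = (0.7, 0.3)  # eigenvalues: nonnegative, trace = 1

# X = lam_1 v_1 v_1^T + lam_2 v_2 v_2^T
X = [[lam[0] * v1[i] * v1[j] + lam[1] * v2[i] * v2[j] for j in range(2)]
     for i in range(2)]

a = (1 / math.sqrt(5), 2 / math.sqrt(5))  # unit-norm test vector

# Direct quadratic form a^T X a
quad = sum(a[i] * X[i][j] * a[j] for i in range(2) for j in range(2))

# Coordinates w_j = <a, v_j>; reformulated denominator sum_j w_j^2 lam_j
w = [a[0] * v[0] + a[1] * v[1] for v in (v1, v2)]
reform = sum(wj * wj * lj for wj, lj in zip(w, lam))

assert abs(quad - reform) < 1e-12  # the two expressions for a^T X a agree
assert abs(sum(wj * wj for wj in w) - 1.0) < 1e-12  # ||a|| = 1 => sum_j w_j^2 = 1
```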

If you disregard the eigenvectors you can get a relaxation. Let $q_{ij}=w_{ij}^2\geq0$. Form vectors $q_i=[q_{i1}\ldots q_{in}]^T$. You get $$ \min \sum_{i=1}^n \frac{c_i}{q^T_i\lambda} $$ with the constraints $\lambda\geq0$, $1^T\lambda\leq1$, $q_i\geq0$, $1^Tq_i=1$. Since the constants $c_i$ are positive, each $q_i$ should maximize its denominator $q_i^T\lambda$; a linear function over the simplex is maximized at a vertex, so the optimal choice is $q_i=e_{j^*}$ with $j^*=\arg\max_j\lambda_j$, which makes the relaxed objective $\sum_{i=1}^n c_i/\lambda_{\max}$. (The choice $q_i=\lambda$ would give the smaller denominator $\|\lambda\|^2\leq\lambda_{\max}$.) Minimizing over $\lambda$ then forces $\lambda_{\max}=1$: the relaxation is solved by a rank-one $\lambda$, with optimal value $\sum_{i=1}^n c_i$.
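For any $\lambda$ on the simplex, a feasible $q$ satisfies $q^T\lambda\leq\lambda_{\max}$, and in particular $\|\lambda\|^2\leq\lambda_{\max}$. A quick pure-Python spot check of these two inequalities (dimension and seed are arbitrary):

```python
import random

random.seed(1)

def simplex_point(n):
    """A random point on the probability simplex."""
    raw = [random.random() + 1e-12 for _ in range(n)]
    s = sum(raw)
    return [r / s for r in raw]

n = 4
for _ in range(1000):
    lam = simplex_point(n)
    lam_max = max(lam)
    q = simplex_point(n)
    # The linear function q -> q^T lam is maximized at a vertex of the
    # simplex, so every feasible q satisfies q^T lam <= lam_max
    assert sum(qi * li for qi, li in zip(q, lam)) <= lam_max + 1e-12
    # In particular the choice q = lam gives ||lam||^2 <= lam_max
    assert sum(li * li for li in lam) <= lam_max + 1e-12
```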

Not an answer, but maybe it can lead you somewhere.

Answer 2.

I rewrite the question (cf. LinAlg's post).

One has $p$ vectors $a_1,\ldots,a_p\in \mathbb{R}^n$ and we consider the function

$f:X\in S_n^+\rightarrow \sum_{i=1}^p\dfrac{1}{a_i^TXa_i}$; we study $\min(f)$ under the condition $tr(X)= 1$.

Note that the $(c_i)$ can be dropped: they are absorbed by replacing each $a_i$ with $a_i/\sqrt{c_i}$ (they appear in the original problem only because the $a_i$ are normalized to $\|a_i\|=1$). On the other hand, when $tr(X)\leq 1$ we have $f(X)\geq f\big(X/tr(X)\big)$, because dividing $X$ by $tr(X)\leq 1$ scales every denominator $a_i^TXa_i$ up; hence we may assume $tr(X)=1$.
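The scaling step can be verified directly: since each denominator is linear in $X$, $f(X/tr(X)) = tr(X)\, f(X) \leq f(X)$ whenever $0 < tr(X) \leq 1$. A small pure-Python check (the matrix and vectors are arbitrary choices of mine):

```python
def quad(a, X):
    """Quadratic form a^T X a for a 2x2 symmetric X."""
    return sum(a[i] * X[i][j] * a[j] for i in range(2) for j in range(2))

def f(X, vecs):
    """f(X) = sum_i 1 / (a_i^T X a_i)."""
    return sum(1.0 / quad(a, X) for a in vecs)

vecs = [(1.0, 2.0), (-1.0, 1.0)]          # arbitrary example vectors
X = [[0.3, 0.1], [0.1, 0.4]]              # PSD (0.3*0.4 > 0.1^2), tr(X) = 0.7
t = X[0][0] + X[1][1]
Xn = [[x / t for x in row] for row in X]  # rescaled so that tr(Xn) = 1

assert f(Xn, vecs) <= f(X, vecs)                  # f does not increase
assert abs(f(Xn, vecs) - t * f(X, vecs)) < 1e-12  # f(X / tr X) = tr(X) * f(X)
```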

I study an example with $n=p=2$.

Let $a_1=[1,2]^T,a_2=[-1,1]^T$. The conditions on $X=\begin{pmatrix}a&b\\b&c\end{pmatrix}$ are $a>0,c>0,ac-b^2\geq 0,a+c=1$ (in general, the first two conditions are inactive at the optimum and may be ignored).

By the KKT conditions, there are $\lambda\in\mathbb{R},\mu\leq 0$ s.t.

$(1)$ $\dfrac{1}{(a+4b+4c)^2}+\dfrac{1}{(a-2b+c)^2}=\lambda+\mu c$

$(2)$ $\dfrac{2}{(a+4b+4c)^2}-\dfrac{1}{(a-2b+c)^2}=-\mu b$

$(3)$ $\dfrac{4}{(a+4b+4c)^2}+\dfrac{1}{(a-2b+c)^2}=\lambda+\mu a$.

$(4)$ $\mu (ac-b^2)=0$.

Beware: equation $(2)$ uses the symmetric derivative; the off-diagonal entry $b$ appears twice in the symmetric matrix $X$, so the derivative with respect to $b$ carries a factor of $2$, which has been divided out in $(2)$.

Subtracting $(1)$ from $(3)$ gives $3/(a+4b+4c)^2=\mu(a-c)$, so $\mu\not=0$; then $(4)$ forces $\det(X)=ac-b^2=0$. This rank deficiency of the optimizer holds in the general case, for every $n$.

It's not difficult to deduce the equation in $a$:

$130a^4-641a^3+1158a^2-695a+49=0$.

The above polynomial has $2$ real roots: $a_1\approx 0.08093763$ and $a_2\approx 0.995411341$. We deduce the associated values of $b,c,\mu$. The best candidate is the first root, which gives

$a\approx 0.08093763,\; b \approx -0.27273931,\; c \approx 0.919062368,\; \mu\approx -0.503520803$.
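These numbers can be checked in a few lines of Python (my own addition, not part of the original answer): find the quartic's roots by bisection, rebuild $b$ and $c$ from $a_1$ using $a+c=1$ and $ac=b^2$, and verify that equations $(1)$–$(3)$ hold with the reported $\mu$, recovering $\lambda$ from $(1)$:

```python
import math

def p(a):
    """The quartic 130 a^4 - 641 a^3 + 1158 a^2 - 695 a + 49."""
    return 130 * a**4 - 641 * a**3 + 1158 * a**2 - 695 * a + 49

def bisect(lo, hi, tol=1e-12):
    """Bisection root finder; assumes p changes sign on [lo, hi]."""
    assert p(lo) * p(hi) < 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if p(lo) * p(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

a1 = bisect(0.0, 0.5)  # p(0) = 49 > 0 and p(0.5) = -81 < 0
a2 = bisect(0.5, 1.0)  # p(1) = 1 > 0
assert abs(a1 - 0.08093763) < 1e-6
assert abs(a2 - 0.995411341) < 1e-6

# Rebuild the candidate X = [[a, b], [b, c]]: trace 1 and det(X) = 0
a = a1
c = 1.0 - a
b = -math.sqrt(a * c)  # negative square root, as in the reported solution
mu = -0.503520803      # reported multiplier

D1 = a + 4 * b + 4 * c  # quadratic form of X at [1, 2]^T
D2 = a - 2 * b + c      # quadratic form of X at [-1, 1]^T

lam = 1 / D1**2 + 1 / D2**2 - mu * c  # lambda recovered from equation (1)

assert abs(2 / D1**2 - 1 / D2**2 + mu * b) < 1e-5        # equation (2)
assert abs(4 / D1**2 + 1 / D2**2 - lam - mu * a) < 1e-5  # equation (3)
assert abs(c - 0.919062368) < 1e-6
assert abs(b + 0.27273931) < 1e-6
```

All four reported values are consistent with the stationarity equations.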