Boyd & Vandenberghe, problem 3.49 (c) — proving that a function is log-concave


In problem 3.49(c) of the book, we are asked to prove that the product-over-sum function $f(x)=\frac{\prod_{i=1}^n x_i}{\sum_{i=1}^n x_i}$, defined for $x \in \mathbb{R}^n_{++}$, is log-concave. I have some questions about the solution to this problem, which I list below. Please clarify them.

In the solution manual, they show that the second derivative of $$\tilde{f}(t)=\sum_{i}\log(x_i+tv_i)-\log \sum_{i}(x_i+tv_i)$$ is less than or equal to zero at $t=0$. The second derivative of $\tilde{f}$ at $t=0$ is $$\tilde{f}''(0)=-\sum_{i}\frac{v_i^2}{x_i^2}+\frac{(\sum_iv_i)^2}{(\sum_i x_i)^2},$$ which should be $\leq 0$ for all $v$. The solution manual says that when $\sum_iv_i\neq 0$, the inequality $$\tilde{f}''(0)=-\sum_{i}\frac{v_i^2}{x_i^2}+\frac{(\sum_iv_i)^2}{(\sum_i x_i)^2}\leq 0 \tag1$$ is homogeneous of degree two in $v$ (what does this mean, and why is it homogeneous of degree two in $v$?), and for this reason they say it can be assumed that $$\sum_iv_i=\sum_ix_i$$ (why can this be assumed without loss of generality?). After this they say that Eq. (1) is equivalent to showing that $$\sum_{i}\frac{v_i^2}{x_i^2}\geq 1$$ holds whenever $\sum_iv_i=\sum_ix_i$. To establish this, they fix a value of $x$ and minimize the convex quadratic form $\sum_{i}\frac{v_i^2}{x_i^2}$ subject to $\sum_iv_i=\sum_ix_i$. After this I do not understand how they reach the conclusion that the optimality condition gives $$\frac{v_i}{x_i^2}=\lambda.$$

Please explain this part too. Is this a use of the KKT conditions? (Up to Chapter 3 the book has not discussed KKT conditions, so I would not recognize them even if they were used.) Any help in this regard will be much appreciated. Thanks in advance.


Best answer:

Homogeneous of degree 2 means that if you scale $v$ by a factor $c$, the expression $\tilde{f}''(0)$ is scaled by a factor $c^2$. Since $c^2 > 0$, the sign of the inequality $\tilde{f}''(0)\leq 0$ does not change when $v$ is rescaled, even if $c<0$. Therefore, whenever $\sum_i v_i \neq 0$, we can rescale $v$ so that its sum equals the sum of $x$: take $c = \sum_i x_i / \sum_i v_i$.
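As a quick numerical sanity check (a minimal sketch; the vectors `x` and `v` below are arbitrary positive test data, not anything from the book, and `fpp0` is just a helper name for $\tilde f''(0)$):

```python
import random

def fpp0(x, v):
    # f~''(0) = -sum_i v_i^2/x_i^2 + (sum_i v_i)^2 / (sum_i x_i)^2
    return -sum(vi**2 / xi**2 for xi, vi in zip(x, v)) + sum(v)**2 / sum(x)**2

random.seed(1)
x = [random.uniform(0.5, 2.0) for _ in range(4)]
v = [random.uniform(0.1, 1.0) for _ in range(4)]  # chosen so sum(v) != 0

# homogeneity of degree 2: scaling v by c scales f~''(0) by c^2
for c in (2.0, -3.0, 0.5):
    scaled = fpp0(x, [c * vi for vi in v])
    assert abs(scaled - c**2 * fpp0(x, v)) < 1e-9

# hence we may rescale v so that its sum equals the sum of x
c = sum(x) / sum(v)
w = [c * vi for vi in v]
assert abs(sum(w) - sum(x)) < 1e-9
assert fpp0(x, w) <= 0  # the sign of the inequality is unchanged by the rescaling
```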

Then they study the problem: $$\min_v \left\{ \sum_i \frac{v_i^2}{x_i^2} : \sum_i v_i = \sum_i x_i \right\}$$ The Lagrangian is $L(v,\lambda) = \sum_i v_i^2 / x_i^2 + \lambda(\sum_i x_i - \sum_i v_i)$, so the stationarity condition (setting the derivative with respect to $v_i$ to zero) is $2 v_i / x_i^2 - \lambda = 0$ for $i=1,2,\ldots,n$, i.e. $\lambda = 2v_i/x_i^2$ for every $i$. The factor 2 is not important: it cancels when you substitute $v_i = \lambda x_i^2/2$ into the constraint $\sum_i v_i = \sum_i x_i$ and solve for $\lambda$, and the solution manual's condition $v_i/x_i^2 = \lambda$ is the same statement up to relabeling the multiplier. Since the objective is convex and the constraint is a single linear equation, this is ordinary Lagrange-multiplier optimality, not the full KKT machinery.
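To make the last step concrete, here is a small numerical sketch (with `x` as arbitrary positive test data): substituting $v_i = \lambda x_i^2/2$ into the constraint gives $\lambda = 2\sum_i x_i / \sum_i x_i^2$, and the resulting minimum value is $(\sum_i x_i)^2 / \sum_i x_i^2$, which is $\geq 1$ for positive $x$, exactly what the inequality requires.

```python
import random

random.seed(2)
x = [random.uniform(0.5, 2.0) for _ in range(5)]

# stationarity: 2*v_i/x_i^2 = lambda  =>  v_i = lambda * x_i^2 / 2;
# the constraint sum(v) == sum(x) then fixes lambda
lam = 2 * sum(x) / sum(xi**2 for xi in x)
v_star = [lam * xi**2 / 2 for xi in x]
assert abs(sum(v_star) - sum(x)) < 1e-12

min_val = sum(vi**2 / xi**2 for vi, xi in zip(v_star, x))
# closed form: (sum x)^2 / sum(x^2), which is >= 1 for positive x
assert abs(min_val - sum(x)**2 / sum(xi**2 for xi in x)) < 1e-12
assert min_val >= 1

# sanity check: random feasible v give no smaller objective value
for _ in range(100):
    v = [random.uniform(0.0, 2.0) for _ in range(5)]
    c = sum(x) / sum(v)
    v = [c * vi for vi in v]  # rescale onto the constraint sum(v) == sum(x)
    assert sum(vi**2 / xi**2 for vi, xi in zip(v, x)) >= min_val - 1e-9
```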

Another answer:

Here is an alternative proof, since I noticed you asked about $v$ and $t$ before. Instead of restricting to a line, compute the Hessian of $f(x)=\sum_i \log x_i - \log \sum_i x_i$ directly: $$H=\begin{pmatrix} -\frac{1}{x_1^2}+c & c & \cdots & c \\ c & -\frac{1}{x_2^2}+c & \cdots & c \\ \vdots & \vdots & \ddots & \vdots \\ c & c & \cdots & -\frac{1}{x_n^2}+c \end{pmatrix}, \qquad c=\frac{1}{(\sum_i x_i)^2}.$$ We need $H$ to be negative semidefinite for concavity, i.e. $$v^THv= - \sum_i \frac{v_i^2}{x_i^2} + \frac{(\sum_i v_i)^2}{(\sum_i x_i)^2} \leq 0 \quad \forall v,$$ which is exactly the condition $\tilde f''(0)\leq 0$ from the question. This is again homogeneous of degree 2 in $v$, so (when $\sum_i v_i \neq 0$) we can rescale $v$ so that $\sum_i v_i = \sum_i x_i$. The remainder of the proof is similar.
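This Hessian argument can also be sketched numerically (with arbitrary positive test data for `x`; `quad_form` is a hypothetical helper name): the quadratic form $v^T H v$ agrees with a finite-difference second derivative of $f$ along the line $x + tv$, and it is nonpositive for random directions $v$.

```python
import math
import random

def f(x):
    # f(x) = sum_i log(x_i) - log(sum_i x_i)
    return sum(math.log(xi) for xi in x) - math.log(sum(x))

def quad_form(x, v):
    # v^T H v with H_ij = -delta_ij/x_i^2 + 1/(sum_k x_k)^2
    return -sum(vi**2 / xi**2 for xi, vi in zip(x, v)) + sum(v)**2 / sum(x)**2

random.seed(3)
x = [random.uniform(0.5, 2.0) for _ in range(6)]

for _ in range(50):
    v = [random.uniform(-1.0, 1.0) for _ in range(6)]
    q = quad_form(x, v)
    assert q <= 1e-12  # negative semidefinite: v^T H v <= 0
    # cross-check against a central second difference of f along x + t*v
    h = 1e-4
    num = (f([xi + h * vi for xi, vi in zip(x, v)])
           - 2 * f(x)
           + f([xi - h * vi for xi, vi in zip(x, v)])) / h**2
    assert abs(num - q) < 1e-4
```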