Compute infimum


I want to compute the following infimum: $$ \inf\limits_{x_1,\ldots,x_n \geqslant 0} \dfrac{x_1 y_1 + \ldots + x_n y_n}{(a_1 x_1^\alpha + \ldots + a_n x_n^\alpha)^{\frac 1 \alpha}} $$ where $y = (y_1,\ldots,y_n) > 0$, $a=(a_1,\ldots,a_n)>0$, and $\alpha \leqslant 1$, $\alpha \neq 0$. What is the easiest way to do this? The straightforward approach is to set the partial derivatives to zero and solve the resulting system in $x_1,\ldots,x_n$, but that system is huge.
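For concreteness, here is a quick numeric sketch (Python with NumPy; the values of `alpha`, `y`, `a` are made up for illustration, not from the question) that evaluates the ratio at random nonnegative points in the case $0 < \alpha < 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example data: n = 4, alpha in (0, 1).
alpha = 0.5
y = np.array([1.0, 2.0, 0.5, 3.0])
a = np.array([2.0, 1.0, 4.0, 0.5])

def ratio(x):
    """(x_1 y_1 + ... + x_n y_n) / (a_1 x_1^alpha + ... + a_n x_n^alpha)^(1/alpha)."""
    return (x @ y) / np.sum(a * x**alpha) ** (1 / alpha)

# Crude Monte Carlo estimate of the infimum: sample random positive points.
samples = rng.exponential(size=(100_000, 4))
estimate = min(ratio(x) for x in samples)
print(estimate)
```

Note that the ratio is homogeneous of degree $0$ in $x$, so the sampling scale does not matter; this only gives an upper estimate of the infimum, of course.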

Let $\beta$ be the conjugate exponent, $\frac 1 \alpha + \frac 1 \beta = 1$, and let $b=(b_1,\ldots,b_n)$, where $b_i = a_i^{-\frac \beta \alpha}$, $i=\overline{1,n}$. Then, solving the optimization problem $x \cdot y \to \mathrm{extr}$ on the set $a_1 x_1^\alpha+\ldots + a_n x_n^\alpha = 1$, we obtain the value $$ (b_1 y_1^\beta + \ldots + b_n y_n^\beta)^{\frac 1 \beta}. $$ Is this value actually the minimum? An interesting consequence would then be the reverse Hölder inequality (take $a_i = 1$, so $b_i = 1$): $$ x_1 y_1 + \ldots + x_n y_n \geqslant (x_1^\alpha+\ldots+x_n^\alpha)^{\frac 1 \alpha} (y_1^\beta + \ldots + y_n^\beta)^{\frac 1 \beta}, \quad \frac 1 \alpha + \frac 1 \beta = 1. $$
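As a sanity check (not a proof), one can verify numerically that the stationary point $x_i = (y_i/a_i)^{1/(\alpha-1)}$ attains exactly the value $(\sum b_i y_i^\beta)^{1/\beta}$, and that random points never go below it. The data are made up for illustration, with $0 < \alpha < 1$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data with 0 < alpha < 1, hence beta = alpha/(alpha-1) < 0.
alpha = 0.5
y = np.array([1.0, 2.0, 0.5, 3.0])
a = np.array([2.0, 1.0, 4.0, 0.5])

beta = alpha / (alpha - 1)            # conjugate exponent: 1/alpha + 1/beta = 1
b = a ** (-beta / alpha)
candidate = np.sum(b * y**beta) ** (1 / beta)

def ratio(x):
    return (x @ y) / np.sum(a * x**alpha) ** (1 / alpha)

# Stationary point of x . y on {sum a_i x_i^alpha = 1}, up to scaling
# (the scaling is irrelevant because the ratio is 0-homogeneous in x).
x_star = (y / a) ** (1 / (alpha - 1))
print(ratio(x_star), candidate)       # the two values agree

# Random points never beat the candidate value.
worst = min(ratio(x) for x in rng.exponential(size=(50_000, 4)))
print(worst >= candidate)
```

This supports the claim that the stationary value is the minimum, at least for $0 < \alpha < 1$; the cases $\alpha < 0$ and $\alpha = 1$ would need separate checks.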

Accepted answer:

Equivalently, you look for the maximum of $\sum a_j x_j^\alpha$ on the simplex $\sum x_jy_j=1$, $x_j\ge 0$. By compactness, the maximum is attained somewhere. Can it be at a boundary point, where some $x_j$ vanishes? No, because at such a point the partial derivative with respect to $x_j$ is infinite, which lets us increase the function by a small perturbation. Precisely, pick $k$ such that $x_k>0$ and consider the competitor with $x_j$ and $x_k$ replaced by $\epsilon$ and $x_k-(y_j/y_k)\epsilon$, respectively (this keeps $\sum x_jy_j=1$). The derivative of $\sum a_j x_j^\alpha$ with respect to $\epsilon$ is $$a_j \alpha\, \epsilon^{\alpha-1} - O(1),$$ which is positive for all small $\epsilon$. So there is no local maximum at a boundary point.
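A quick numeric illustration of that boundary argument (a made-up two-variable instance with $0<\alpha<1$): starting from a point with $x_1=0$ on the simplex, the mass-shifting perturbation strictly increases $\sum a_j x_j^\alpha$ for small $\epsilon$.

```python
import numpy as np

alpha = 0.5
a = np.array([2.0, 1.0])
y = np.array([1.0, 2.0])

def f(x):
    return np.sum(a * x**alpha)

# Boundary point of the simplex {x . y = 1, x >= 0} with x_1 = 0.
x0 = np.array([0.0, 0.5])

gains = []
for eps in [1e-2, 1e-4, 1e-6]:
    # Replace x_1 by eps and x_2 by x_2 - (y_1/y_2) * eps, keeping x . y = 1.
    x = np.array([eps, 0.5 - (y[0] / y[1]) * eps])
    gains.append(f(x) - f(x0))

print(gains)   # all positive: the boundary point is not a local maximum
```

The gain is dominated by the $a_1\epsilon^\alpha$ term, exactly as in the derivative computation above.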

Therefore, the maximum is attained at a stationary point in the interior of the simplex, which you already found.