How to minimize $\frac{1}{y_1} + \frac{1}{y_2} + \cdots + \frac{1}{y_k}$ subject to $y_1+y_2+ \cdots + y_k = a$?


We seek how to minimize $$ \frac{1}{1-x_1} + \frac{1}{1-x_2} + \cdots + \frac{1}{1-x_k} $$ for $x_1,\ldots,x_k \in (0,1)$ subject to $$ x_1+x_2+ \cdots + x_k = c $$ for some constant $c \in (0,k)$. We expect this minimum occurs when $x_1=x_2=\cdots=x_k=\frac{c}{k}$.

A first step seems to be to transform it to the problem of minimizing $$ \frac{1}{y_1} + \frac{1}{y_2} + \cdots + \frac{1}{y_k} $$ where $y_1,\ldots,y_k \in (0,1)$ subject to $$ y_1+y_2+ \cdots + y_k = a $$ where $a=k-c$. Here, we expect this minimum occurs when $y_1=y_2=\cdots=y_k=\frac{a}{k}$.
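Before proving it, the conjecture can be sanity-checked numerically. The sketch below (the values $k=4$, $a=2$ and the helper `recip_sum` are illustrative choices, not from the question) compares the equal allocation against random positive points summing to $a$; the conjectured bound $k^2/a$ in fact holds for any positive $y_i$ with fixed sum, not just those in $(0,1)$:

```python
import random

def recip_sum(ys):
    """The objective: sum of reciprocals 1/y_1 + ... + 1/y_k."""
    return sum(1.0 / y for y in ys)

k, a = 4, 2.0                          # illustrative parameters
best = recip_sum([a / k] * k)          # conjectured minimum value: k^2 / a
assert abs(best - k**2 / a) < 1e-12

random.seed(0)
for _ in range(1000):
    u = [random.random() for _ in range(k)]
    s = sum(u)
    ys = [a * ui / s for ui in u]      # random positive point with sum a
    assert recip_sum(ys) >= best - 1e-9
```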

We can prove this when $k=2$, since $$ \frac{1}{y_1}+\frac{1}{y_2}=\frac{y_1+y_2}{y_1y_2}=\frac{y_1+(a-y_1)}{y_1(a-y_1)}=\frac{a}{y_1(a-y_1)}, $$ and setting the derivative of the denominator to zero shows that $y_1(a-y_1)$ is maximized, and hence the sum is minimized, when $y_1=y_2=a/2$. Generalizing this to $3$ or more terms does not seem straightforward; there may be a nice method for proving it that I'm unaware of.
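The two-variable closed form $\frac{a}{y_1(a-y_1)}$ can also be checked by a grid search; a minimal sketch (the value $a=1.5$ and the grid resolution are arbitrary choices):

```python
# Grid search over y1 in (0, a) for the k = 2 closed form a / (y1 * (a - y1)).
a = 1.5
f = lambda y1: a / (y1 * (a - y1))
grid = [i * a / 1000 for i in range(1, 1000)]
y_best = min(grid, key=f)
# The minimizer should sit at y1 = a / 2 (up to grid resolution).
assert abs(y_best - a / 2) < a / 1000 + 1e-12
```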

Other questions on similar topics do not seem to solve this problem: for example, "minimizing the sum of reciprocals is equivalent to maximizing the sum" lacks the constraint that $y_1+\cdots+y_k$ is fixed, and "Minimizing Sum of Reciprocals" imposes an additional constraint.

There are 4 answers below.

Answer 1:

Note that by the AM-HM inequality, $$\frac{\frac{1}{1-x_1}+\frac{1}{1-x_2}+\cdots+\frac{1}{1-x_k}}{k} \ge \frac{k}{(1-x_1)+(1-x_2)+\cdots+(1-x_k)} = \frac{k}{k-c},$$ so the sum is at least $\frac{k^2}{k-c}$. The minimum is achieved when $x_1=x_2=\cdots=x_k=\frac{c}{k}$.
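The AM-HM bound can be spot-checked numerically; a small sketch (the parameter $k=6$, the sampling range, and the number of trials are illustrative assumptions):

```python
import random

# Check sum 1/(1-x_i) >= k^2 / (k - c) for random x_i in (0, 1),
# where c = sum x_i is whatever the sample happens to give.
random.seed(1)
k = 6
for _ in range(500):
    xs = [random.uniform(0.01, 0.99) for _ in range(k)]
    c = sum(xs)
    lhs = sum(1.0 / (1 - x) for x in xs)
    assert lhs >= k * k / (k - c) - 1e-9
```

Equality holds exactly when all the $x_i$ coincide, matching the claimed minimizer.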

Answer 2:

This is just the Cauchy-Schwarz inequality. From the constraint,

$$\sum_{i=1}^k x_i= c\Rightarrow \sum_{i=1}^k (1-x_i)=k-c$$

Hence, by the Cauchy-Schwarz inequality, $$\left(\sum_{i=1}^k (1-x_i)\right) \left(\sum_{i=1}^k \frac {1}{1-x_i} \right) \ge k^2 \Rightarrow \sum_{i=1}^k \frac {1}{1-x_i}\ge \frac {k^2}{k-c},$$ with equality when all the $x_i$ are equal, i.e. $x_i=\frac{c}{k}$.
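The product form of the Cauchy-Schwarz step, $\left(\sum b_i\right)\left(\sum 1/b_i\right) \ge k^2$ with $b_i = 1-x_i$, can be verified directly; an illustrative sketch (the parameter $k=5$ and the sampling range are assumptions):

```python
import random

# Check (sum b_i) * (sum 1/b_i) >= k^2 for positive b_i = 1 - x_i.
random.seed(2)
k = 5
for _ in range(500):
    b = [random.uniform(0.05, 0.95) for _ in range(k)]
    assert sum(b) * sum(1.0 / bi for bi in b) >= k * k - 1e-9
```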

Answer 3:

Two answers with specific inequalities have already been given. If you want to prove it from first principles, you can use a Lagrange multiplier:

$$ \frac\partial{\partial y_j}\left(\sum_iy_i^{-1}+\lambda\sum_iy_i\right)=-y_j^{-2}+\lambda=0 $$

and thus

$$ y_j=\lambda^{-\frac12}\;, $$

a constant independent of $j$. Then you just have to compare with values on the boundary, where some of the $y_i$ equal $1$ (none can be $0$, since the objective diverges there) and the rest are again equal inside $(0,1)$; as in the two-variable case, you can show that this gives a larger value of the objective function than making all of them equal.

Answer 4:

Let $f(y_1,...,y_k)=\frac {1} {y_1} + ... + \frac 1 {y_k}$ be the function to minimize, and let $g(y_1,...,y_k)=y_1+...+y_k -a$. A minimum exists: $f$ is continuous and blows up as any $y_i \to 0$, so the minimization can be restricted to a compact subset of the domain.

You're aiming to minimize $f$ on $(0,1)^k$ under the linear constraint that $g(y_1,...,y_k)=0$.

This is what the Lagrange multiplier technique is for. I recommend reading about it; it's pretty cool (there is a geometric interpretation of the technique that is great for building intuition)! In this case, it states that an interior minimum can only occur at points where the gradients of $f$ and $g$ are collinear. In other words, a necessary condition for the minimum to be achieved at $y=(y_1,...,y_k)$ is that there exists $\lambda \in \mathbb R$ such that, for all $i$, $$\frac {\partial f}{\partial y_i}(y) = \lambda \frac {\partial g}{\partial y_i}(y).$$ Such a $\lambda$ must be negative, and for all $i$, $$-\frac 1 {y_i^2} = \lambda, $$ so $y_1=y_2=...=y_k=\frac 1 {\sqrt{-\lambda}}$; combined with the constraint $y_1+\cdots+y_k=a$, this gives $y_i=\frac a k$ for all $i$.
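The stationarity condition can be checked numerically at the candidate point; a minimal sketch (the values $k=5$, $a=3$ are illustrative, not from the answer): at $y_i = a/k$, the gradient of $f$ is a constant multiple of the gradient of $g$, with multiplier $\lambda = -(k/a)^2$.

```python
# Hypothetical numeric check: at y_i = a/k, grad f = lambda * grad g.
k, a = 5, 3.0
y = [a / k] * k
grad_f = [-1.0 / yi**2 for yi in y]   # d/dy_i of sum_j 1/y_j
grad_g = [1.0] * k                    # d/dy_i of (sum_j y_j - a)
lam = grad_f[0] / grad_g[0]           # the multiplier, should be -(k/a)^2
assert all(abs(gf - lam * gg) < 1e-12 for gf, gg in zip(grad_f, grad_g))
assert abs(lam + (k / a) ** 2) < 1e-12
```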