Minimization of $\frac{a_1+a_2x_2+a_3x_3}{b_1+b_2x_2+b_3x_3}$

Assume $\frac{a_1}{b_1}<\frac{a_2}{b_2}<\frac{a_3}{b_3}$, where $a_i,b_i >0$. Assume $x_2 \ge \frac{1}{8}$ and $x_3 \ge \frac{1}{8}x_2$. I guess that the minimal value of $$\frac{a_1+a_2x_2+a_3x_3}{b_1+b_2x_2+b_3x_3}$$ is obtained when $x_2=\frac{1}{8}$ and $x_3=\frac{1}{8^2}$. How can I prove the result?

Best answer

Let $f$ be defined by \begin{align} f(x,y)&:=\dfrac{a_1+a_2x+a_3y}{b_1+b_2x+b_3y}. \end{align} As a quotient of affine functions, $f$ is smooth (hence continuous) wherever the denominator $b_1+b_2x+b_3y$ is nonzero; since $b_1,b_2,b_3>0$, this holds in particular on the whole feasible region $x\geq 1/8$, $y\geq x/8$. Fix $R>0$; on the truncated region \begin{align} \mathcal{R}_R&:=\{(x,y)\in\mathbb{R}^2\mid 1/8\leq x\leq R,\ x/8\leq y\leq R\}, \end{align} $f$ is bounded and attains its minimum, since $\mathcal{R}_R$ is compact.
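As a quick numerical sanity check of the conjecture, one can minimize $f$ over a grid on $\mathcal{R}_R$; the coefficients below are sample values chosen only for illustration so that $a_1/b_1<a_2/b_2<a_3/b_3$ holds:

```python
import numpy as np

# Sample coefficients (for illustration only) with a1/b1 < a2/b2 < a3/b3:
a = np.array([1.0, 2.0, 4.0])
b = np.array([2.0, 2.0, 2.0])
assert a[0]/b[0] < a[1]/b[1] < a[2]/b[2]

def f(x, y):
    return (a[0] + a[1]*x + a[2]*y) / (b[0] + b[1]*x + b[2]*y)

# Grid over the truncated feasible region 1/8 <= x <= R, x/8 <= y <= R.
R = 20.0
xs = np.linspace(1/8, R, 400)
ys = np.linspace(1/64, R, 400)
X, Y = np.meshgrid(xs, ys)
vals = np.where(Y >= X/8, f(X, Y), np.inf)  # mask out infeasible points
i, j = np.unravel_index(np.argmin(vals), vals.shape)
print(X[i, j], Y[i, j], vals[i, j])  # minimum at the corner (0.125, 0.015625)
```

For these sample coefficients the grid minimum sits at the corner $(1/8,1/64)$, consistent with the conjecture.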

A local extremum of $f$ in the interior of $\mathcal{R}_R$ can only occur at points $(x,y)$ where the differential of $f$ vanishes, i.e. where \begin{align} 0=\mathrm{d}f(x,y)=\dfrac{a_2(b_1+b_2x+b_3y)-b_2(a_1+a_2x+a_3y)}{(b_1+b_2x+b_3y)^2}\mathrm{d}x+\dfrac{a_3(b_1+b_2x+b_3y)-b_3(a_1+a_2x+a_3y)}{(b_1+b_2x+b_3y)^2}\mathrm{d}y. \end{align} This happens precisely when \begin{align} \begin{cases} a_2(b_1+b_2x+b_3y)-b_2(a_1+a_2x+a_3y)&=0\\ a_3(b_1+b_2x+b_3y)-b_3(a_1+a_2x+a_3y)&=0 \end{cases} \end{align} i.e. when \begin{align} \begin{cases} a_2b_1+a_2b_3y&=a_1b_2+a_3b_2y\\ a_3b_1+a_3b_2x&=a_1b_3+a_2b_3x \end{cases}. \end{align} Hence we get \begin{align} \mathrm{d}f(x,y)=0\quad\Longleftrightarrow\quad(x,y)=\left(\dfrac{a_1b_3-a_3b_1}{a_3b_2-a_2b_3},\dfrac{a_1b_2-a_2b_1}{a_2b_3-a_3b_2}\right). \end{align}
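The algebra leading to this closed form can be double-checked symbolically; here is a small sketch using SymPy (the symbol names are ad hoc):

```python
import sympy as sp

# Symbolic check that the 2x2 linear system obtained from df = 0
# has the closed-form solution stated above.
a1, a2, a3, b1, b2, b3, x, y = sp.symbols('a1 a2 a3 b1 b2 b3 x y', positive=True)
eq1 = sp.Eq(a2*b1 + a2*b3*y, a1*b2 + a3*b2*y)
eq2 = sp.Eq(a3*b1 + a3*b2*x, a1*b3 + a2*b3*x)
sol = sp.solve([eq1, eq2], [x, y], dict=True)[0]

x_star = (a1*b3 - a3*b1) / (a3*b2 - a2*b3)
y_star = (a1*b2 - a2*b1) / (a2*b3 - a3*b2)
# Both differences simplify to zero, confirming the formulas:
print(sp.simplify(sol[x] - x_star), sp.simplify(sol[y] - y_star))
```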

This critical point does not depend on the choice of the constant $R$, so it is the only candidate for a local extremum of $f$. As I do not know the values of the $a_i$ and $b_i$, $i\in\{1,2,3\}$, here is how you can proceed to conclude:

$\bullet$ If the above point is a local minimum (you can check this, for example, by verifying that the Hessian of $f$ at this point is positive definite), then you are done, because $f$ must increase in every direction away from it (otherwise there would be at least one other critical point where $\mathrm{d}f$ vanishes).

$\bullet$ If it is a local maximum of $f$, then your function decreases in every direction away from it, and the infimum is attained on the boundary $\{(x,y)\in\mathbb{R}^2\mid x=1/8\text{ or }y=x/8\}$ (in this case it is the minimum of $f$) or at infinity.
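Restricting $f$ to each boundary piece gives a one-variable function that can be minimized directly. A minimal numerical sketch, with sample coefficients chosen (for illustration only) so that $a_1/b_1<a_2/b_2<a_3/b_3$:

```python
import numpy as np

# Sample coefficients, assumed for illustration only.
a1, a2, a3 = 1.0, 2.0, 4.0
b1, b2, b3 = 2.0, 2.0, 2.0

# Boundary piece x = 1/8, y >= 1/64: minimize g(y) = f(1/8, y).
y = np.linspace(1/64, 50, 200001)
g = (a1 + a2/8 + a3*y) / (b1 + b2/8 + b3*y)
print(y[np.argmin(g)])  # -> 0.015625, i.e. the minimum is at y = 1/64

# Boundary piece y = x/8, x >= 1/8: minimize h(x) = f(x, x/8).
x = np.linspace(1/8, 50, 200001)
h = (a1 + a2*x + a3*x/8) / (b1 + b2*x + b3*x/8)
print(x[np.argmin(h)])  # -> 0.125, i.e. the minimum is at x = 1/8
```

For these sample values both restrictions are increasing, so their minima sit at the corner $(1/8,1/64)$.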

$\bullet$ If it is a saddle point, then there is a direction in which $f$ decreases, so you can restrict $f$ to this direction and minimize the resulting one-variable function. This direction is the line spanned by the eigenvector associated with the negative eigenvalue of the Hessian of $f$ evaluated at the above critical point (it is not necessarily one of the canonical basis vectors of $\mathbb{R}^2$; it can be a linear combination of them).
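The eigenvector extraction in the saddle case can be sketched numerically; the Hessian below is a hypothetical sample matrix with one negative eigenvalue, not one derived from $f$:

```python
import numpy as np

# Hypothetical symmetric Hessian at a saddle point (sample values only).
H = np.array([[2.0, 3.0],
              [3.0, 1.0]])

eigvals, eigvecs = np.linalg.eigh(H)  # eigh: eigenvalues in ascending order
assert eigvals[0] < 0 < eigvals[1]    # saddle: exactly one negative eigenvalue
v = eigvecs[:, 0]                     # descent direction along which f decreases
print(eigvals[0], v)
```

Restricting the objective to the line through the critical point in direction `v` then reduces the problem to a one-variable minimization.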