Behavior of $f(x,y)= e^{-xy}+e^{1/y}$ as a function of $x$ at $y^{*}$ that minimizes the function at a given $x$


I have the following function $f(x,y)= e^{-xy}+e^{1/y}$.

I am trying to study its behavior as a function of $x$ at $y^{*}=g(x)$ that minimizes $f(x,y)$ for a given $x$.

Based on plots in Matlab, I see that it is a decreasing function of $x$ at $y^{*}$. However, I want to show this mathematically.
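For reference, the observation can be double-checked numerically outside Matlab. Below is a minimal Python sketch (function names are mine); it restricts the search to $y>0$, the branch the plots appear to track. There $f$ is strictly convex in $y$, since $\partial^2 f/\partial y^2 = x^2e^{-xy} + (2/y^3+1/y^4)e^{1/y} > 0$ for $y>0$, so a simple ternary search locates $y^*$:

```python
import math

def f(x, y):
    return math.exp(-x * y) + math.exp(1.0 / y)

def y_star(x, lo=1e-6, hi=50.0, iters=200):
    """Minimize f(x, .) over y > 0 by ternary search.
    Valid because f is strictly convex in y on (0, inf)."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(x, m1) < f(x, m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

xs = [-2.0, -1.5, -1.0, -0.5, -0.25]
pairs = [(x, y_star(x)) for x in xs]
vals = [f(x, ys) for x, ys in pairs]
for (x, ys), v in zip(pairs, vals):
    print(f"x = {x:6.2f}  y* = {ys:.4f}  f(x, y*) = {v:.4f}")

# f(x, y*(x)) should decrease as x increases -- the claim to be proved
assert all(vals[i] > vals[i + 1] for i in range(len(vals) - 1))
```

The printed values of $f(x,y^*)$ decrease as $x$ increases, matching the plots.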

The function is not convex in terms of $y$ and there is no closed form solution for $y^{*}$.

Approach: I tried to prove that $y^{*}$ decreases as $x$ increases and that $xy^{*}$ is constant, and then to show that $f(x,y^{*})$ is decreasing as a function of $x$. No success.

Can someone help me to prove this?

There are 3 solutions below.

BEST ANSWER

1) The condition that $y^*$ minimizes $f(x,y)$ for a given $x$ reads: $\frac{\partial f(x,y)}{\partial y} \bigg\vert_{y^*}=0$. (Whether $y^*$ is a maximum or a minimum should be addressed for completeness!)

2) Behavior of $f(x,y^*(x))$:

$\frac{d f(x,y^*(x))}{dx} = \frac{\partial f(x,y)}{\partial x} \bigg\vert_{y^*(x)} + \frac{\partial f(x,y)}{\partial y} \bigg\vert_{y^*(x)} \times \frac{dy^*(x)}{dx}$.

Using $(1)$, we see from $(2)$ that the second term on the RHS is zero, so:

$\frac{d f(x,y^*(x))}{dx} = \frac{\partial f(x,y)}{\partial x} \bigg\vert_{y^*(x)} = -y^*(x) \exp[-x y^*(x)]$.
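This envelope-type formula can be sanity-checked numerically by comparing a centered finite difference of $f(x,y^*(x))$ against $-y^*(x)e^{-xy^*(x)}$. A hedged Python sketch (names are mine; $y^*$ is computed by ternary search over $y>0$, where $f$ is strictly convex in $y$):

```python
import math

def f(x, y):
    return math.exp(-x * y) + math.exp(1.0 / y)

def y_star(x, lo=1e-6, hi=50.0, iters=300):
    # ternary search; f is strictly convex in y on (0, inf)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(x, m1) < f(x, m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def h(x):
    return f(x, y_star(x))

for x in [-2.0, -1.0, -0.5]:
    eps = 1e-5
    fd = (h(x + eps) - h(x - eps)) / (2 * eps)   # numerical dh/dx
    ys = y_star(x)
    envelope = -ys * math.exp(-x * ys)           # -y*(x) exp(-x y*(x))
    print(f"x = {x:5.2f}: finite diff = {fd:.5f}, envelope = {envelope:.5f}")
    assert abs(fd - envelope) < 1e-3
```

At $x=-1$ the search returns $y^*\approx 1$ and both columns agree, consistent with the formula above.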

The sign of this derivative depends on the sign of $y = y^*(x)$, which satisfies $x y^2 e^{-xy} + e^{1/y}=0$. We note that if in your case $x \gt 0$, there is no real solution for $y^*$ (both terms are positive, so their sum cannot be zero). If $x \le 0$, we need to prove (to fit your observation of a decreasing function of $x$) that $y^*(x)>0$.

Writing $x=-t \le 0$, we look for $y=y^*$ solving $$y^2 e^{ty-\frac{1}{y}} = \frac{1}{t}, \qquad t \gt 0.$$ If you plot the LHS (left-hand side) as a function of $y$ you will see that (seemingly; I didn't check in detail) there are two solutions, one positive and one negative.
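For a concrete value of $t$ this can be checked by bisection. A hedged Python sketch for $t=2$ (function names are mine), bracketing each root by a sign change of $y^2e^{ty-1/y}-1/t$. Incidentally, at $y=\pm 1/\sqrt t$ the exponent $ty-1/y$ vanishes, so the left-hand side equals $1/t$ exactly; the numerical roots land on these values:

```python
import math

def phi(y, t):
    # phi(y) = y^2 * exp(t*y - 1/y) - 1/t ; its roots are candidate y*
    return y * y * math.exp(t * y - 1.0 / y) - 1.0 / t

def bisect(g, lo, hi, iters=200):
    # assumes g(lo) and g(hi) have opposite signs
    assert g(lo) * g(hi) < 0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

t = 2.0
pos = bisect(lambda y: phi(y, t), 0.1, 5.0)    # positive root
neg = bisect(lambda y: phi(y, t), -5.0, -0.1)  # negative root

# direct substitution: y = +/- 1/sqrt(t) makes the exponent zero,
# so the LHS equals 1/t there
print(pos, neg, 1 / math.sqrt(t))
assert abs(pos - 1 / math.sqrt(t)) < 1e-6
assert abs(neg + 1 / math.sqrt(t)) < 1e-6
```

(For $t=2$ each bracket contains exactly one sign change; other values of $t$ may behave differently and would need their own brackets.)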

It remains to be seen which one gives the requested minimum of $f(x,y^*(x))$. According to your observations and the above, $y^* \gt 0$ should be the one.

Indeed, if we take the expression for the second derivative of $f(x,y)$ with respect to $y$ given in Samrat's answer and simplify it using the condition that the first derivative vanishes, we obtain that for $x < 0$ (as discussed above) and $y^* > 0$, we have a minimum.

The only thing left to do is to determine what kind of extremum the other solution, $y^* < 0$, is.


This is an incomplete answer, but might be useful in further understanding the nature of the function $f$.

Note that $$\frac{\partial f}{\partial y}=-xe^{-xy}-\frac{1}{{y}^2}e^{1/y}.$$ Hence, for $x\ge 0$ this is negative for every $y>0$, so $y^\star=\infty$, and $f(x,y^\star)=1$ for $x> 0$ (for $x=0$ the limiting value is $2$).

For $x< 0$, $y^\star$ is a solution of $\partial f/\partial y=0$, and it must satisfy $$\frac{\partial^2f}{\partial y^2}\bigg|_{y^\star}\ge 0\Leftrightarrow x^2e^{-xy^\star}+\frac{(1+2y^\star)e^{1/y^\star}}{{y^\star}^4}\ge 0\Leftrightarrow x^2e^{-xy^\star}-xe^{-xy^\star}\frac{1+2y^\star}{{y^\star}^2}\ge 0\Leftrightarrow \frac{1+2y^\star}{{y^\star}^2}-x\ge 0,$$ where the second step substitutes the first-order condition $e^{1/y^\star}=-x{y^\star}^2e^{-xy^\star}$ and the last step divides by $-xe^{-xy^\star}>0$. It is clear that a positive-valued solution of $\partial f/\partial y=0$ will definitely satisfy this condition. However, it does not seem clear whether a negative-valued solution is also a minimizer.

Now, $$\frac{\partial f}{\partial x}=-ye^{-xy}.$$ If $y^\star>0$, then $\frac{\partial f}{\partial x}\big|_{y^\star}<0$ and the function is decreasing, for fixed $y=y^\star$, while if $y^\star<0$, the function is increasing. We only now need to check whether $y^\star<0$ is possible.
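The criterion $\frac{1+2y^\star}{{y^\star}^2}-x\ge 0$ can be spot-checked at the positive critical point. A hedged Python sketch (function names are mine; $y^\star$ is found by ternary search over $y>0$, where $f$ is strictly convex in $y$):

```python
import math

def f(x, y):
    return math.exp(-x * y) + math.exp(1.0 / y)

def y_star(x, lo=1e-6, hi=50.0, iters=300):
    # ternary search; f is strictly convex in y on (0, inf)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(x, m1) < f(x, m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

for x in [-2.0, -1.0, -0.5, -0.1]:
    ys = y_star(x)
    criterion = (1 + 2 * ys) / ys**2 - x   # >= 0 at a minimum
    print(f"x = {x:5.2f}  y* = {ys:.4f}  (1+2y*)/y*^2 - x = {criterion:.4f}")
    assert criterion > 0
```

The criterion comes out strictly positive at every sampled $x<0$, consistent with the positive branch being a minimum.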


I don't have a solution, but I think I've made good progress toward one. Suppose you had found a function $g$ such that $y^* = g(x)$. Define $h(x) = f(x,y^*) = f(x,g(x))$. Your question amounts to showing that $h(x)$ is a decreasing function. This is equivalent to showing that $h'(x) < 0$, since $h$ (as you can check) is differentiable for all $x$. We have, by the chain rule $$ h'(x) = \frac{d}{dx} f(x,g(x)) = \begin{bmatrix}\frac{\partial f}{\partial x}(x,g(x)) & \frac{\partial f}{\partial y}(x,g(x)) \end{bmatrix} \begin{bmatrix} 1 \\ g'(x) \end{bmatrix} =\frac{\partial f}{\partial x}(x,g(x)) + \frac{\partial f}{\partial y}(x, g(x)) g'(x) $$ Now $$ \frac{\partial f}{\partial x} = -ye^{-xy} $$ and $$ \frac{\partial f}{\partial y} = -xe^{-xy} - \frac{1}{y^2} e^{1/y} $$ Since $g(x)$ minimizes $f(x,\cdot)$ over $y$, the first-order condition $\frac{\partial f}{\partial y}(x,g(x)) = 0$ holds, so the $g'(x)$ term vanishes; and because that partial derivative is zero, we are also free to add it back in once. Substituting, we find $$ h'(x) = -g(x)e^{-xg(x)} - xe^{-xg(x)} - \frac{e^{1/g(x)}}{g^2(x)} = -\left( e^{-xg(x)}(g(x) + x) + \frac{e^{1/g(x)}}{g^2(x)} \right) $$ Observe that $e^{-xg(x)} > 0$ and $\frac{e^{1/g(x)}}{g^2(x)} > 0$ for all $x$. Thus, if we can show that $g(x) + x > 0$ for all $x$, we will have shown that $h'(x) < 0$ for all $x$.

In summary, your original problem can be reduced to showing that $g(x) + x > 0$ for all $x$.

To show that $g(x) + x > 0$ for all $x$, it suffices to show that the infimum of $g(x) + x$ is greater than $0$.

Edit

Showing that $g(x) + x > 0$ will show that $h'(x) < 0$. In fact, it will show the stronger result that $h'(x) < -\frac{e^{1/g(x)}}{g^2(x)}$ for all $x$, which may or may not actually be true. If it turns out that this isn't actually true, then you'll just have to show that $h'(x) < 0$ directly.
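Whether $g(x)+x>0$ actually holds can be probed numerically. A hedged Python sketch (function names are mine; $g$ is computed by ternary search over $y>0$, where $f$ is strictly convex in $y$, and $h'$ is evaluated via the accepted answer's formula $h'(x)=-g(x)e^{-xg(x)}$):

```python
import math

def f(x, y):
    return math.exp(-x * y) + math.exp(1.0 / y)

def g(x, lo=1e-6, hi=50.0, iters=300):
    # y*(x): minimize f(x, .) over y > 0 by ternary search (f convex in y there)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(x, m1) < f(x, m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def h_prime(x):
    # h'(x) = -g(x) exp(-x g(x)), the envelope formula from the accepted answer
    gx = g(x)
    return -gx * math.exp(-x * gx)

for x in [-2.0, -1.0, -0.5, -0.25]:
    print(f"x = {x:5.2f}  g(x)+x = {g(x) + x:7.4f}  h'(x) = {h_prime(x):7.4f}")
    assert h_prime(x) < 0          # f(x, y*(x)) is indeed decreasing
# the stronger condition g(x)+x > 0 appears to fail for x <= -1
# even though h'(x) < 0 still holds:
assert g(-2.0) + (-2.0) < 0 and g(-0.5) + (-0.5) > 0
```

The sample suggests $g(x)+x>0$ fails once $x \le -1$ while $h'(x)<0$ still holds, i.e., the fallback mentioned above (showing $h'(x)<0$ directly) seems to be the route that is actually needed.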