If $0<x<\infty$, prove that $0<\frac{1}{e^x-1}-\frac{1}{x}+\frac{1}{2}<\frac{x}{12}$


Given that $f(x)=e^x(x^2-6x+12)-(x^2+6x+12),\;\;x>0,$ is an increasing function, I want to prove that:

If $0<x<\infty$, then $0<\frac{1}{e^x-1}-\frac{1}{x}+\frac{1}{2}<\frac{x}{12}$.
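Before attempting a proof, a quick numerical sanity check suggests the claim is plausible (a Python sketch, standard library only; `math.expm1` is used to avoid cancellation for small $x$):

```python
import math

def middle(x):
    # The quantity claimed to lie strictly between 0 and x/12.
    # expm1(x) = e^x - 1, computed without catastrophic cancellation near 0.
    return 1.0 / math.expm1(x) - 1.0 / x + 0.5

for x in [0.01, 0.1, 1.0, 5.0, 20.0]:
    m = middle(x)
    assert 0.0 < m < x / 12.0, (x, m)
```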

Here is what I have done:

If $0<x<\infty$, then by Mean Value Theorem, $\exists\; c\in(0,x)$ such that

$$f'(c)=\frac{e^x(x^2-6x+12)-(x^2+6x+12)- 0}{x- 0}>0$$

but how do I get the desired inequality? Can anyone help out?

There are 2 best solutions below

We'll prove that $$0<\frac{1}{e^x-1}-\frac{1}{x}+\frac{1}{2}$$ or $$\frac{1}{e^x-1}>\frac{2-x}{2x},$$ which is obvious for $x\geq2$.

But, for $0<x<2$ we need to prove that $$e^x-1<\frac{2x}{2-x}$$ or $f(x)>0,$ where $$f(x)=\ln(x+2)-\ln(2-x)-x.$$ Indeed, $$f'(x)=\frac{x^2}{4-x^2}>0,$$ which says $$f(x)>\lim_{x\rightarrow0^+}f(x)=0$$ and the left inequality is proven.
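This step can be checked mechanically (a Python sketch, standard library only): the term-by-term derivative of $f$ should agree with $\frac{x^2}{4-x^2}$, and $f$ itself should be positive on $(0,2)$:

```python
import math

def f(x):
    # f(x) = ln(x+2) - ln(2-x) - x, defined for 0 < x < 2
    return math.log(x + 2) - math.log(2 - x) - x

def f_prime(x):
    # Differentiating term by term: 1/(x+2) + 1/(2-x) - 1
    return 1.0 / (x + 2) + 1.0 / (2 - x) - 1.0

for x in [0.1, 0.5, 1.0, 1.5, 1.9]:
    assert abs(f_prime(x) - x * x / (4 - x * x)) < 1e-12
    assert f(x) > 0
```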

In the same way we can prove the right-hand inequality.

Indeed, we need to prove that $$\frac{1}{e^x-1}<\frac{x}{12}+\frac{1}{x}-\frac{1}{2}$$ or $$e^x-1>\frac{12x}{x^2-6x+12}$$ or $g(x)>0,$ where $$g(x)=x-\ln(x^2+6x+12)+\ln(x^2-6x+12)$$ and since $$g'(x)=\frac{x^4}{(x^2+6x+12)(x^2-6x+12)}>0,$$ we obtain: $$g(x)>\lim_{x\rightarrow0^+}g(x)=0$$ and we are done.
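The claimed formula for $g'$ can be verified with exact rational arithmetic (a Python sketch using `fractions`; since both sides are rational functions, agreement at sample points is a sanity check, not a proof):

```python
from fractions import Fraction

def g_prime(x):
    # Term-by-term derivative of g(x) = x - ln(x^2+6x+12) + ln(x^2-6x+12)
    p = x * x + 6 * x + 12
    q = x * x - 6 * x + 12
    return 1 - (2 * x + 6) / p + (2 * x - 6) / q

for x in [Fraction(1, 3), Fraction(1), Fraction(7, 2), Fraction(10)]:
    p = x * x + 6 * x + 12
    q = x * x - 6 * x + 12
    assert g_prime(x) == x ** 4 / (p * q)  # exact equality, no rounding
```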


Michael Rozenberg's answer proves the inequalities without reference to the assumption that $f(x)=e^x(x^2-6x+12)-(x^2+6x+12)$ is increasing for $x\gt0$. This answer will show that that assumption can be used to prove the right-hand inequality, ${1\over e^x-1}-{1\over x}+{1\over2}\lt{x\over12}$.

As Michael points out, that inequality is equivalent to

$$e^x-1\gt{12x\over x^2-6x+12}$$

Since $x^2-6x+12=(x-3)^2+3\gt0$ for all $x$, we can clear out the denominator and move everything to the left hand side, obtaining the equivalent inequality

$$e^x(x^2-6x+12)-(x^2+6x+12)\gt0$$

At this point we'd like to say this follows because the left hand side is the increasing function $f(x)$, for which $f(0)=0$. However, all that "increasing" implies here is that $f(x)\ge0$ for $x\gt0$. We need to make things strict.

One way to show the strict inequality $f(x)\gt0$ for $x\gt0$ is to take derivatives until one of them is nonzero at $x=0$ and then check that it's positive. It turns out, though, that that means taking five derivatives. So we'll take a less computational approach:

If $f(a)=0$ for some $a\gt0$, then, since $f$ is increasing and $f(0)=0$, we have $f(x)=0$ for all $x$ between $0$ and $a$, which means that

$$e^x={x^2+6x+12\over x^2-6x+12}\quad\text{for }0\lt x\lt a$$

But the exponential function is not a rational function on any interval. Hence we cannot have $f(x)=0$ for any $x\gt0$, so we must have $f(x)\gt0$ for all $x\gt0$.
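As an aside on the five-derivative count mentioned above, the Taylor coefficients of $f(x)=e^x(x^2-6x+12)-(x^2+6x+12)$ at $0$ can be computed exactly (a Python sketch, standard library only): the coefficients of $x^0,\dots,x^4$ all vanish, and the coefficient of $x^5$ is positive.

```python
from fractions import Fraction
from math import factorial

def taylor_coeff(n):
    # n-th Taylor coefficient at 0 of f(x) = e^x(x^2-6x+12) - (x^2+6x+12).
    # The product e^x * (x^2 - 6x + 12) contributes 12/n! - 6/(n-1)! + 1/(n-2)!.
    c = Fraction(12, factorial(n))
    if n >= 1:
        c -= Fraction(6, factorial(n - 1))
    if n >= 2:
        c += Fraction(1, factorial(n - 2))
    # Subtract the coefficients of the polynomial x^2 + 6x + 12.
    c -= {0: 12, 1: 6, 2: 1}.get(n, 0)
    return c

assert all(taylor_coeff(n) == 0 for n in range(5))
print(taylor_coeff(5))  # → 1/60, so f^(5)(0) = 5! * (1/60) = 2 > 0
```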

To see why $e^x$ cannot agree with any rational function on any interval, suppose it did, e.g., $e^x=P(x)/Q(x)$ on an interval $(a,b)$. Then, differentiating both sides and substituting $e^x=P(x)/Q(x)$ back in, we would wind up with $P(x)Q(x)=P'(x)Q(x)-P(x)Q'(x)$ for all $x\in(a,b)$. But the left-hand side has degree $m+n$ ($m$ and $n$ being the degrees of $P$ and $Q$), while the right-hand side has degree at most $m+n-1$, so this is a nontrivial polynomial equation, which has only finitely many solutions, not uncountably many.
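For the specific $P(x)=x^2+6x+12$ and $Q(x)=x^2-6x+12$ at hand, this can be seen concretely (a Python sketch with hand-rolled coefficient-list polynomials, standard library only): $PQ-(P'Q-PQ')$ comes out to $x^4$, a nonzero polynomial.

```python
def polymul(a, b):
    # Coefficient lists, lowest degree first.
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def polyderiv(a):
    return [i * ai for i, ai in enumerate(a)][1:]

def polysub(a, b):
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x - y for x, y in zip(a, b)]

P = [12, 6, 1]   # x^2 + 6x + 12
Q = [12, -6, 1]  # x^2 - 6x + 12
lhs = polymul(P, Q)
rhs = polysub(polymul(polyderiv(P), Q), polymul(P, polyderiv(Q)))
diff = polysub(lhs, rhs)
print(diff)  # → [0, 0, 0, 0, 1], i.e. PQ - (P'Q - PQ') = x^4
```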

As for the other inequality, $0\lt{1\over e^x-1}-{1\over x}+{1\over2}$, I don't see any way to prove it using the assumption on $f(x)$. The best alternative (to Michael's answer) I can think of is to note that, for $0\lt x\lt2$, the inequality is equivalent to $e^x\lt{2+x\over2-x}$ (as in Michael's answer), which under the substitution $x=2u$ becomes

$$e^{2u}\lt{1+u\over1-u}=1+2u+2u^2+2u^3+2u^4+\cdots\quad\text{for }0\lt u\lt1$$

and then consider the Taylor series $e^{2u}=1+2u+2u^2+{4\over3}u^3+{2\over3}u^4+\cdots$. The strict inequality ${2^n\over n!}\lt2$ for $n\ge3$ is enough to finish the proof.
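The coefficient comparison can be checked directly (a Python sketch, standard library only): $e^{2u}$ has Taylor coefficients $2^n/n!$, the geometric series has coefficients $1,2,2,2,\dots$, and $2^n/n!\lt2$ for $n\ge3$:

```python
import math
from fractions import Fraction
from math import factorial

# Coefficients agree for n = 0, 1, 2: 1, 2, 2.
assert [Fraction(2 ** n, factorial(n)) for n in range(3)] == [1, 2, 2]

# Strict inequality 2^n/n! < 2 from n = 3 on.
for n in range(3, 50):
    assert Fraction(2 ** n, factorial(n)) < 2

# Spot-check the resulting strict inequality e^{2u} < (1+u)/(1-u) on (0, 1).
for u in [0.01, 0.1, 0.5, 0.9, 0.99]:
    assert math.exp(2 * u) < (1 + u) / (1 - u)
```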