Prove that $2 \int_{x}^{x+1} \log(t) dt \geq \log(x (x+1))$.



I want a proof of this that does not use geometry; I would like to know techniques for solving it other than the geometric one.

One can easily see that this inequality is true for $x>0$ just by drawing the graph of $\log(t)$ and observing that the area of the trapezium with vertices $(x,0)$, $(x+1,0)$, $(x+1,\log(x+1))$ and $(x,\log(x))$ is less than the area under the curve between $x$ and $x+1$.

There are 3 answers below.

Accepted answer:

Changing the variable of integration in two different ways: \begin{multline*} 2\int_x^{x+1}\log(t)\,dt = \int_0^1\log(x+u)\,du + \int_0^1\log(x+1-u)\,du \\ = \int_0^1[\log(x+u) + \log(x+1-u)]\,du = \int_0^1\log[(x+u)(x+1-u)]\,du \\ = \int_0^1\log[x(x+1)+u(1-u)]\,du \geqslant \int_0^1\log[x(x+1)]\,du = \log[x(x+1)]. \end{multline*} The inequality in the last line holds because $u(1-u)\geq 0$ for $u\in[0,1]$ and $\log$ is increasing.
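As a quick sanity check, the key identity and the resulting inequality can be verified numerically. The following is a minimal sketch using only the Python standard library; the sample points are arbitrary choices of mine:

```python
import math

def integral_log(x):
    # Exact value of the integral of log(t) over [x, x+1],
    # using the antiderivative t*log(t) - t.
    return (x + 1) * math.log(x + 1) - x * math.log(x) - 1

# Spot-check the algebraic identity (x+u)(x+1-u) = x(x+1) + u(1-u).
x, u = 2.3, 0.7
assert abs((x + u) * (x + 1 - u) - (x * (x + 1) + u * (1 - u))) < 1e-12

# Check the inequality 2*integral >= log(x(x+1)) at a few points.
for x in (0.1, 0.5, 1.0, 3.0, 100.0):
    lhs = 2 * integral_log(x)
    rhs = math.log(x * (x + 1))
    assert lhs >= rhs, (x, lhs, rhs)
    print(f"x={x:>6}: {lhs:.6f} >= {rhs:.6f}")
```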

Answer:

Let $f(x)=2 \int_{x}^{x+1} \ln(t) \,dt - \ln(x (x+1))$ for $x>0$.

Differentiating (the integral term via the fundamental theorem of calculus) gives $f'(x)=2 \ln\left(1+\frac{1}{x}\right)-\frac{2x+1}{x(x+1)}$ and

$$f''(x)=\frac{-2}{x^2+x}+\frac{2x^2+2x+1}{x^2(x+1)^2}=\frac{1}{x^2(x+1)^2}$$

Now $f''(x)>0$ for all $x>0$, which means $f'(x)$ is an increasing function.

Moreover, $\displaystyle \lim_{x \to +\infty}f'(x)=0$. Since $f'$ is increasing and tends to $0$, it follows that $f'(x)<0$ for all $x>0$.

Hence $f(x)$ is a decreasing function.

Now, since $\int \ln(t)\,dt = t\ln(t)-t$, we can compute $f$ explicitly: $f(x)=(2x+1)\ln\left(1+\frac{1}{x}\right)-2=\ln\left(\frac{\left(1+\frac{1}{x}\right)^{2x+1}}{e^2}\right)$

$\displaystyle \lim_{x \to +\infty}f(x)=\ln 1=0$, since $\left(1+\frac{1}{x}\right)^{2x+1}\to e^2$. Because $f$ is decreasing and tends to $0$, we get $f(x)>0$ for all $x>0$.

That is, $f(x)=2 \int_{x}^{x+1} \ln(t) \,dt - \ln(x (x+1))>0$, and therefore

$$2 \int_{x}^{x+1} \ln(t) \,dt>\ln(x (x+1)).$$
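For what it's worth, the derivatives and limits computed above can be checked symbolically; a sketch, assuming SymPy is available:

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)

# f(x) = 2 * integral of ln(t) over [x, x+1], minus ln(x(x+1)).
f = 2 * sp.integrate(sp.log(t), (t, x, x + 1)) - sp.log(x * (x + 1))

# f'' should simplify to 1/(x^2 (x+1)^2), which is positive for x > 0.
f2 = sp.diff(f, x, 2)
print(sp.simplify(f2 - 1 / (x**2 * (x + 1)**2)))  # expected: 0

# Both f' and f should tend to 0 as x -> infinity.
print(sp.limit(sp.diff(f, x), x, sp.oo))  # expected: 0
print(sp.limit(f, x, sp.oo))              # expected: 0
```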

Answer:

Although an answer has already been accepted, I am writing this one because the given problem can be deduced as a special case of a more general statement. This arose from the observation that the given inequality has the form $$\int\limits_{a}^{b} f(t)\, dt \geq (b-a) \frac{f(a)+f(b)}{2}.$$

If we ask under what conditions on $f$ the above inequality holds, it turns out that:

$\textbf{Statement:}$ If $f$ is twice differentiable and concave on $[a,b]$, then $$ \int\limits_{a}^{b} f(t)\, dt \geq (b-a) \frac{f(a)+f(b)}{2}.$$

To see why this is so, consider the function $H(x)$ defined by $$H(x) = \int\limits_{a}^x f(t)\, dt - (x-a) \frac{f(x)+f(a)}{2}.$$

Clearly

$$H''(x) = -\frac{(x-a)f''(x)}{2} \geq 0, \,\, \forall \,x\in[a,b],$$ since $f''\leq 0$ by concavity.

Moreover, $H(a)=0=H'(a)$. Thus it can never happen that $H(x) < 0$ for some $x\in [a,b]$ (why?). Hence $$H(x) \geq 0, \,\, \forall\, x\in[a,b].$$
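To illustrate the argument, here is a hedged numerical check that $H(x)\geq 0$ for the concave function $f(t)=\log t$; the interval $[1,4]$, the grid, and the midpoint rule are my own choices for the sketch:

```python
import math

def H(f, a, x, n=1000):
    # Approximate H(x): the integral of f over [a, x] (composite midpoint
    # rule) minus the trapezoid area (x - a) * (f(x) + f(a)) / 2.
    h = (x - a) / n
    integral = sum(f(a + (k + 0.5) * h) for k in range(n)) * h
    return integral - (x - a) * (f(x) + f(a)) / 2

a, b = 1.0, 4.0
for i in range(1, 11):
    x = a + (b - a) * i / 10
    assert H(math.log, a, x) >= 0, x
print("H(x) >= 0 at all sampled points of [1, 4]")
```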

$\textbf{Edit:}$

As mentioned in the comments, concavity alone is enough. This is just an elaboration of the comment of Héhéhé.

If $f$ is concave on $[a,b]$, then for every $t\in [0,1]$ we have $$f(a(1-t)+bt) \geq (1-t) f(a) + tf(b).$$

Integrating the above with respect to $t$ from $0$ to $1$, and applying the substitution $a(1-t)+bt = \theta$ (so that $d\theta = (b-a)\,dt$) on the left-hand side, we arrive at

$$\int\limits_{a}^{b} f(\theta) \, d\theta \geq (b-a) \frac{f(a)+f(b)}{2}.$$
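Finally, the general inequality itself can be spot-checked for a few concave functions; the examples below are my own choices, not part of the answer:

```python
import math

def holds(f, a, b, n=10000):
    # Midpoint-rule approximation of the integral of f over [a, b],
    # compared against the trapezoid value (b - a)(f(a) + f(b)) / 2.
    h = (b - a) / n
    integral = sum(f(a + (k + 0.5) * h) for k in range(n)) * h
    return integral >= (b - a) * (f(a) + f(b)) / 2

print(holds(math.log, 0.5, 3.0))           # True
print(holds(math.sqrt, 0.0, 2.0))          # True
print(holds(lambda t: -t * t, -1.0, 1.0))  # True
```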