Show $A(x) = \int_{0}^{x}f(t)dt$ is convex if $f(x)$ is increasing.


Without using $f'(x) = A''(x)$ and the fact that $A''(x) > 0$ implies convexity:

Show $A(x) = \int_{0}^{x}f(t)dt$ is convex if $f(x)$ is increasing.

This is from Apostol, Calculus Vol. 1, Theorem 2.9, p. 122. My attempt is below.

For a function to be convex in $[a,b]$ we need $\forall \alpha \in (0,1)$

$$f(\alpha b + (1-\alpha) a) < (\alpha) f(b) + (1-\alpha) f(a)$$

Using this with $a = 0$ and $b = x$ (and noting $A(0) = 0$), we need to show $A(\alpha x) < \alpha A(x)$.

Now $A(\alpha x) = \int_0^{\alpha x} f(t) dt = \alpha \int _{0}^{x} f(\alpha t) dt < \alpha \int_{0}^{x} f(t) dt = \alpha A(x)$ because $t > \alpha t \implies f(t) > f(\alpha t)$.
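As a numerical sanity check of this computation (the choice $f(t) = t^3$ and the trapezoid integrator are illustrative assumptions, not part of the question), one can approximate $A$ and compare $A(\alpha x)$ with $\alpha A(x)$; as the answers below explain, though, this inequality alone is weaker than convexity:

```python
# Sanity check of A(alpha*x) < alpha*A(x) for an increasing f.
# Assumed example: f(t) = t**3, which is increasing on [0, 2];
# then A(x) = x**4 / 4 exactly.

def A(x, f, n=10_000):
    """Approximate A(x) = integral of f from 0 to x by the trapezoid rule."""
    h = x / n
    s = 0.5 * (f(0.0) + f(x))
    for k in range(1, n):
        s += f(k * h)
    return s * h

f = lambda t: t ** 3
x = 2.0
for alpha in (0.1, 0.3, 0.5, 0.7, 0.9):
    lhs = A(alpha * x, f)
    rhs = alpha * A(x, f)
    print(f"alpha={alpha}: A(alpha*x)={lhs:.4f} < alpha*A(x)={rhs:.4f}: {lhs < rhs}")
```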

Is this proof all right, and how can I write it more properly? I am self-learning calculus and will then proceed to real analysis.

There are 2 answers below.

BEST ANSWER

$A(\alpha x) < \alpha A(x)$ does not tell you that $A$ is concave upward (i.e., convex). What you have to show is
$$A(\alpha x +(1-\alpha) y) \leq \alpha A(x) +(1-\alpha) A(y)$$
for $x<y$ and $0 < \alpha <1$.

To prove this, write the required inequality in the form
$$\alpha \,[A(\alpha x +(1-\alpha) y) - A(x)] \leq (1-\alpha) \,[A(y)-A(\alpha x +(1-\alpha) y)].$$
This becomes
$$\alpha \int_x^{\alpha x +(1-\alpha) y} f(t) \, dt \leq (1-\alpha) \int_{\alpha x +(1-\alpha) y}^{y} f(t)\, dt.$$

Make the change of variable $s=\alpha (t-x)$ on the left side to get $\int_0^{w} f\!\left(x+\frac s {\alpha}\right) ds$, where $w=\alpha (1-\alpha) (y-x)$. Similarly, the change of variable $s=(1-\alpha)(y-t)$ writes the right-hand side as $\int_0^{w} f\!\left(y-\frac s {1-\alpha}\right) ds$.

Finally, all that remains is to verify that
$$f\!\left(x+\frac s {\alpha}\right)\leq f\!\left(y-\frac s {1-\alpha}\right)$$
whenever $s$ lies in the interval $(0,w)$. Since $f$ is increasing, we only have to show that $x+\frac s {\alpha}\leq y-\frac s {1-\alpha}$ whenever $s$ lies in $(0,w)$; this rearranges to $s \leq \alpha(1-\alpha)(y-x) = w$. I leave it to you to verify this simple fact.
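The chord inequality in this answer can be spot-checked numerically. A minimal sketch, with the assumed choice $f(t) = e^t$ (so $A(x) = e^x - 1$), verifies both the convexity inequality and its rearranged form:

```python
import math

def A(x, n=10_000):
    """A(x) = integral of e^t from 0 to x, by the trapezoid rule (exact: e^x - 1)."""
    h = x / n
    s = 0.5 * (math.exp(0.0) + math.exp(x))
    for k in range(1, n):
        s += math.exp(k * h)
    return s * h

x, y = 0.5, 2.0              # assumed sample points with x < y
for alpha in (0.25, 0.5, 0.75):
    c = alpha * x + (1 - alpha) * y
    chord = alpha * A(x) + (1 - alpha) * A(y)
    assert A(c) <= chord                          # A(c) lies below the chord
    # the rearranged form used in the answer:
    assert alpha * (A(c) - A(x)) <= (1 - alpha) * (A(y) - A(c))
    print(f"alpha={alpha}: A(c)={A(c):.4f} <= chord={chord:.4f}")
```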

ANSWER

Your proof does not hold for the same reasons Kavi Rama Murty brought up.

Assuming $f$ is continuous, then by the fundamental theorem of calculus what you showed is equivalent to the backward implication of:

Let $f$ be differentiable on an interval $I\subset \mathbb{R}$. Then $f$ is convex on $I$ iff $f'$ is increasing on $I$.

Proof: let $a < c < b$ in $I$. By the mean value theorem, there exist $a'\in (a,c)$ and $b'\in (c,b)$ such that
$$ \frac{f(c)-f(a)}{c-a}=f'(a') \quad\text{and}\quad \frac{f(b)-f(c)}{b-c}=f'(b'). $$
Because $f'$ is increasing, $f'(a')\leq f'(b')$, and so
$$ \frac{f(c)-f(a)}{c-a}\leq \frac{f(b)-f(c)}{b-c}, $$
which is sometimes called the "slopes inequality" and is equivalent to $f$ being convex (see this post or this one).
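The slopes inequality is easy to check numerically. A minimal sketch, with the assumed convex function $f(t) = t^2$ (whose derivative $f'(t) = 2t$ is increasing):

```python
# Check the slopes inequality: for a < c < b and convex f,
#   (f(c) - f(a)) / (c - a)  <=  (f(b) - f(c)) / (b - c).
# Assumed example: f(t) = t**2.

def slope(f, u, v):
    """Slope of the secant line of f between u and v."""
    return (f(v) - f(u)) / (v - u)

f = lambda t: t * t
a, c, b = -1.0, 0.5, 2.0      # assumed sample points with a < c < b
left, right = slope(f, a, c), slope(f, c, b)
assert left <= right
print(f"{left} <= {right}")
```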