Without using $A''(x) = f'(x)$ or the fact that $A''(x) > 0$ implies convexity,
show that $A(x) = \int_{0}^{x} f(t)\,dt$ is convex if $f(x)$ is increasing.
This is from Apostol, Calculus Vol. 1, Theorem 2.9, p. 122. My attempt is given below.
For a function $g$ to be convex on $[a,b]$ we need, for all $\alpha \in (0,1)$,
$$g(\alpha b + (1-\alpha) a) < \alpha g(b) + (1-\alpha) g(a).$$
Using this with $a = 0$ (where $A(0) = 0$), we need to show $A(\alpha x) < \alpha A(x)$.
Now $A(\alpha x) = \int_0^{\alpha x} f(t)\,dt = \alpha \int_{0}^{x} f(\alpha t)\,dt < \alpha \int_{0}^{x} f(t)\,dt = \alpha A(x)$, because for $t > 0$ we have $\alpha t < t$ and hence $f(\alpha t) < f(t)$.
Is this proof alright, and how can I write it more properly? I am self-learning calculus and will then proceed to real analysis.
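For what it's worth, the individual steps of my attempt can be checked numerically (an illustration only, not a proof, with an assumed increasing function $f(t) = t^3$ and arbitrary sample values):

```python
# Numerical illustration (not a proof) of the steps in the attempt,
# with the assumed increasing function f(t) = t**3 and sample x, alpha.

def f(t):
    return t ** 3  # any increasing function on [0, x] would do

def integrate(g, a, b, n=100_000):
    """Midpoint Riemann-sum approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

x, alpha = 2.0, 0.4

A_alpha_x = integrate(f, 0, alpha * x)                         # A(alpha * x)
substituted = alpha * integrate(lambda t: f(alpha * t), 0, x)  # alpha * int_0^x f(alpha t) dt
alpha_A_x = alpha * integrate(f, 0, x)                         # alpha * A(x)

print(abs(A_alpha_x - substituted))  # ~0: the substitution t -> alpha*t is exact
print(A_alpha_x < alpha_A_x)         # the claimed inequality, for this example
```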
$A(\alpha x) < \alpha A(x)$ does not tell you that $A$ is concave upward: it is the convexity inequality only for the special pair of points $0$ and $x$. What you have to show is
$$A(\alpha x + (1-\alpha) y) \leq \alpha A(x) + (1-\alpha) A(y)$$
for $x < y$ and $0 < \alpha < 1$.

To prove this, write the required inequality in the form
$$\alpha \,[A(\alpha x + (1-\alpha) y) - A(x)] \leq (1-\alpha) \,[A(y) - A(\alpha x + (1-\alpha) y)].$$
This becomes
$$\alpha \int_x^{\alpha x + (1-\alpha) y} f(t) \, dt \leq (1-\alpha) \int_{\alpha x + (1-\alpha) y}^{y} f(t)\, dt.$$

Make the change of variable $s = \alpha (t-x)$ on the left side to get $\int_0^{w} f\!\left(x+\frac s {\alpha}\right) ds$, where $w = \alpha (1-\alpha) (y-x)$. Similarly, with $s = (1-\alpha)(y-t)$, write the right-hand side as $\int_0^{w} f\!\left(y-\frac s {1-\alpha}\right) ds$.

Finally, all that remains is to verify that
$$f\!\left(x+\frac s {\alpha}\right) \leq f\!\left(y-\frac s {1-\alpha}\right)$$
whenever $s$ lies in the interval $(0,w)$. Since $f$ is increasing, we only have to show that $x+\frac s {\alpha} \leq y-\frac s {1-\alpha}$ whenever $s$ lies in the interval $(0,w)$. I leave it to you to verify this simple fact.
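Not a substitute for the proof, but the two changes of variable can be sanity-checked numerically (with an assumed increasing $f(t) = t^3$ and arbitrary sample values; I take $s = \alpha(t-x)$ on the left and $s = (1-\alpha)(y-t)$ on the right):

```python
# Numerical sanity check (illustration only) of the change-of-variable
# identities, with the assumed increasing f(t) = t**3 and sample x, y, alpha.

def f(t):
    return t ** 3  # any increasing function would do

def integrate(g, a, b, n=100_000):
    """Midpoint Riemann-sum approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

x, y, alpha = 0.3, 2.0, 0.4
m = alpha * x + (1 - alpha) * y        # the intermediate point
w = alpha * (1 - alpha) * (y - x)      # common upper limit after substitution

left = alpha * integrate(f, x, m)
right = (1 - alpha) * integrate(f, m, y)

# After s = alpha*(t - x) on the left and s = (1 - alpha)*(y - t) on the right:
left_sub = integrate(lambda s: f(x + s / alpha), 0, w)
right_sub = integrate(lambda s: f(y - s / (1 - alpha)), 0, w)

print(abs(left - left_sub))    # ~0: left substitution checks out
print(abs(right - right_sub))  # ~0: right substitution checks out
print(left <= right)           # the convexity inequality, for this example
```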