Let $f:[0,1] \rightarrow \mathbb R^+$ be a continuous, non-increasing function with $f(x)>0$ for $x\in[0,1)$ and $f(1)=0$, and suppose that
$$\int_0^1 \frac{f(x)}{(1-x)^a}\,dx$$
exists for all $0\leq a \leq 1$ but does not exist for any $a>1$. Does that tell us anything about the (worst/best) rate of convergence of $f$ to $0$? Or perhaps about the convexity/concavity of the function $f$?
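For concreteness, here is a numerical sketch of this convergence condition for one candidate function, $f(x)=\log^{-2}\!\big(e/(1-x)\big)$ (this choice of $f$ is my own assumption for illustration, not part of the question). The truncated integrals $I_a(\varepsilon)=\int_0^{1-\varepsilon} f(x)/(1-x)^a\,dx$ appear to stabilize as $\varepsilon \to 0$ when $a=1$ but grow without bound for $a=1.5$:

```python
import math

def truncated_integral(a, eps, n=200_000):
    """Midpoint-rule approximation of I_a(eps) = int_0^{1-eps} f(x)/(1-x)^a dx
    for the candidate f(x) = log(e/(1-x))**(-2)  (a hypothetical example,
    not taken from the question).

    The substitution u = log(e/(1-x)) turns the integral into
        int_1^U exp((a-1)(u-1)) / u^2 du,   U = log(e/eps),
    whose integrand is smooth, so a uniform midpoint rule is accurate."""
    U = math.log(math.e / eps)
    h = (U - 1) / n
    total = 0.0
    for i in range(n):
        u = 1 + (i + 0.5) * h
        total += math.exp((a - 1) * (u - 1)) / (u * u) * h
    return total

for eps in (1e-2, 1e-4, 1e-6, 1e-8):
    print(f"eps={eps:.0e}  a=1.0: {truncated_integral(1.0, eps):.4f}"
          f"  a=1.5: {truncated_integral(1.5, eps):.4f}")
```

For $a=1$ the substitution even gives the closed form $I_1(\varepsilon)=1-1/\log(e/\varepsilon)\to 1$, while for any $a>1$ the transformed integrand contains a growing exponential factor, so the truncated integrals diverge as $\varepsilon\to 0$. Note also that this candidate $f$ decays to $0$ more slowly than any positive power of $1-x$.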
My intuition is that there should exist a $t\in [0,1)$ such that, beyond that point $t$, the graph of $f$ always lies on or above the graph of $1-x$; otherwise the convergence property above would fail. Since $f$ is non-increasing, that would make $f$ concave. Is my intuition correct, and how would one prove such a statement?