Suppose $f(x)$ is continuous on $[0,\infty)$ and $\displaystyle\int_c^{\infty}\frac{f(x)}{x}\, dx$ converges for every $c>0$. Prove that $\displaystyle\int_0^{\infty}\frac{f(\alpha x)-f(\beta x)}{x}\, dx = f(0) \ln \frac{\beta}{\alpha}$.
This is my solution:
First split the integral: $\displaystyle\int_A^{B}\frac{f(\alpha x)-f(\beta x)}{x}\, dx=\int_A^B \frac{f(\alpha x)}{x} \, dx-\int_A^B \frac{f(\beta x)}{x} \, dx$.
Then substitute $u=\alpha x$ in the first integral and $u=\beta x$ in the second.
This gives $\displaystyle\int_{\alpha A}^{\alpha B} \frac{f(u)}{u} \, du-\int_{\beta A}^{\beta B} \frac{f(u)}{u} \, du$. Then we can let $A \rightarrow 0$ and $B\rightarrow \infty$. But I cannot complete the proof this way.
I mean, is there something wrong with my method? Or can my method not be applied to this question?
Your method does not work for this question because when you split up the original integral into the difference of two integrals, those new integrals may not converge. As an example, take $f(x) = \frac{1}{1+x}$. The integral $$\int_0^\infty \frac{1}{x(1+\alpha x)}\,dx$$ (and similarly for $\beta$) does not converge because of the singularity at 0. However, the difference $$\int_0^\infty \frac{\frac{1}{1+\alpha x} - \frac{1}{1+\beta x}}{x}\, dx = \int_0^\infty \frac{\beta-\alpha}{(1+\alpha x)(1+\beta x)}\,dx$$ converges, because the singularity at 0 has been cancelled out by the subtraction.
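The cancellation can also be checked numerically. Below is a small sketch (assuming SciPy is available) that integrates the combined form $\frac{\beta-\alpha}{(1+\alpha x)(1+\beta x)}$, which has no singularity at $0$, and compares it with the claimed value $f(0)\ln\frac{\beta}{\alpha} = \ln\frac{\beta}{\alpha}$ for $f(x) = \frac{1}{1+x}$; the values $\alpha = 1$, $\beta = 2$ are an arbitrary choice for illustration.

```python
# Numerical check of the identity for f(x) = 1/(1+x), f(0) = 1:
#   int_0^inf (f(a x) - f(b x))/x dx = ln(b/a).
# We integrate the combined rational form, in which the
# 1/x singularities at 0 have already cancelled.
import numpy as np
from scipy.integrate import quad

a, b = 1.0, 2.0  # arbitrary alpha, beta with 0 < a < b

# Combined integrand: (f(a x) - f(b x))/x simplified algebraically.
def integrand(x):
    return (b - a) / ((1 + a * x) * (1 + b * x))

value, abserr = quad(integrand, 0, np.inf)
expected = np.log(b / a)  # f(0) * ln(beta/alpha) with f(0) = 1

print(value, expected)
```

By contrast, asking `quad` to integrate $\frac{1}{x(1+\alpha x)}$ alone on $(0,\infty)$ fails to converge, mirroring the argument above.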