The question is as stated in the title. We are given the hint to use the LDCT (Lebesgue dominated convergence theorem).
Since this is homework, I'm not looking for an explicit solution; I just need hints.
For example, my first thought was to rewrite $f$ as
$$ f(x) = \lim_{n \rightarrow \infty} \sum_{i = 1}^n f_i(x) $$
and then to use the linearity of the integral (together with some justification for interchanging the limit and the integral, which is presumably where LDCT comes in):
$$ \int_b^{\infty}f(x)dx = \lim_{n \rightarrow \infty} \sum_{i = 1}^n \int_b^{\infty}f_i(x)dx $$
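For concreteness, one possible choice of the pieces $f_i$ (I haven't fixed them above, so this is just an illustration) is to truncate $f$ to unit intervals,
$$ f_i(x) = f(x)\,\chi_{[b+i-1,\; b+i)}(x), \qquad \text{so that} \qquad \sum_{i=1}^n f_i(x) = f(x)\,\chi_{[b,\; b+n)}(x), $$
and each $f_i$ is integrable because $f$ is.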
I'm not sure how to show that the integral must go to zero as $b \rightarrow \infty$. I thought there would be a connection to the integrability of each $f_i$, and that if the terms didn't eventually go to zero we would reach a contradiction. But that doesn't seem right now.
HINT: $|f|$ must be integrable, hence for every $\varepsilon>0$ there exists $b_\varepsilon$ such that $$ \int_{b_\varepsilon}^{\infty} |f(x)|\, dx<\varepsilon. $$ Then for every $b \ge b_\varepsilon$, $$ \left| \int_b^{\infty} f(x)\,dx \right| \le \int_b^{\infty} |f(x)|\,dx \le \int_{b_\varepsilon}^{\infty} |f(x)|\,dx < \varepsilon. $$
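To see where LDCT itself enters, here is a minimal sketch, assuming $f$ is Lebesgue integrable on $[a,\infty)$ (the sequence $b_n$ and the truncations $g_n$ are my notation, not from the problem): for any sequence $b_n \rightarrow \infty$, set
$$ g_n(x) = f(x)\,\chi_{[b_n,\,\infty)}(x). $$
Then $g_n(x) \rightarrow 0$ pointwise (every fixed $x$ eventually lies to the left of $b_n$), and $|g_n| \le |f|$ with $|f|$ integrable, so LDCT gives
$$ \int_{b_n}^{\infty} f(x)\,dx = \int_a^{\infty} g_n(x)\,dx \rightarrow 0. $$
Since the sequence $b_n \rightarrow \infty$ was arbitrary, $\int_b^{\infty} f(x)\,dx \rightarrow 0$ as $b \rightarrow \infty$. The same argument applied to $|f|$ produces the $b_\varepsilon$ above.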