I'm a little puzzled about solving 1st order linear ODEs and determining when absolute values can be dropped. To give a specific example, consider $y'+\frac{1}{t}y=t$. With the integrating factor approach we would be interested in $\mu(t)=e^{\int 1/t~dt}=e^{\ln|t|}=|t|$. I have seen some solutions that blithely drop the absolute value. For this particular problem, doing so yields an answer which can readily be checked to satisfy the ODE. But I can't see a justification for why I can drop the absolute values a priori.
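For what it's worth, "can readily be checked" can be done symbolically; here is a quick sympy sketch (the names `t`, `C`, and `residual` are just my placeholders for the general solution obtained after dropping the absolute value):

```python
import sympy as sp

t, C = sp.symbols('t C')
y = t**2 / 3 + C / t  # candidate general solution with the absolute value dropped
# Plug y into y' + y/t - t; if this simplifies to 0, the ODE holds for t != 0
residual = sp.simplify(sp.diff(y, t) + y / t - t)
print(residual)  # prints 0
```

Of course this only confirms the answer after the fact, which is exactly the a posteriori check I'm trying to avoid.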
If I were to continue with the integrating factor method, I would get to a place that looks like
$$\frac{1}{|t|}\int t|t|~dt=\frac{1}{|t|}\left( \frac{t^3 \operatorname{sgn}(t)}{3} + c \right)$$
and I can see how this is equivalent to
$$\frac{t^2}{3} + \frac{c}{t} $$
where there is a little fudging with the $c$ constant to get rid of the absolute value around the $t$ in that term's denominator. So in the end we arrived at the same result as if we had initially dropped the absolute values.
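That constant-fudging can be spot-checked numerically: on $t<0$ we have $c/|t|=-c/t$, so relabeling $c'=-c$ turns one form into the other. A minimal sketch (the function names `y_abs` and `y_plain` are mine, not standard):

```python
def y_abs(t, c):
    # solution kept in terms of |t|:  t^2/3 + c/|t|
    return t**2 / 3 + c / abs(t)

def y_plain(t, c):
    # solution with the absolute value dropped:  t^2/3 + c/t
    return t**2 / 3 + c / t

t, c = -2.0, 5.0
# On t < 0 the two forms agree once the constant is relabeled c' = -c
assert abs(y_abs(t, c) - y_plain(t, -c)) < 1e-12
```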
Here are my questions: is there a shortcut, or, to be rigorous, do all of the above steps need to be taken? If there is no shortcut for determining a priori whether the absolute values may be dropped, can you provide an example where dropping them yields an incorrect solution?