I'm a first-time Calc I student currently struggling in class. Yesterday we started integration by substitution. One problem our professor put on the board was:
$$\int \frac{\left(x^2+2.1x\right)}{\left(x^3+3x^2+12\right)^6}dx$$
And he refused to solve it, saying nobody in the room would be able to, and that even he (a Ph.D.) had no idea where to start. However, just before that question, we had one that was almost identical except for a decimal point:
$$\int \frac{\left(x^2+2x\right)}{\left(x^3+3x^2+12\right)^6}dx$$
with $u = x^3+3x^2+12$
and $du = (3x^2+6x)dx$
Eventually, we got to the answer of $\frac{-1}{15\left(x^3+3x^2+12\right)^5}+C$ (if anyone wants me to edit in the complete steps of the answer, please let me know).
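Since a few people may want them, here is a sketch of the intermediate steps, using only the $u$ and $du$ given above:

$$\begin{aligned}
\int \frac{x^2+2x}{\left(x^3+3x^2+12\right)^6}dx
&= \frac{1}{3}\int \frac{3\left(x^2+2x\right)}{\left(x^3+3x^2+12\right)^6}dx
= \frac{1}{3}\int u^{-6}\,du \\
&= \frac{1}{3}\cdot\frac{u^{-5}}{-5}+C
= \frac{-1}{15\left(x^3+3x^2+12\right)^5}+C.
\end{aligned}$$

The key step is that the numerator $(x^2+2x)\,dx$ is exactly $\frac{1}{3}\,du$, so the whole integrand collapses to a power of $u$.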
So my question is: why does a tiny decimal of $0.1$ make the otherwise-identical problem so much harder, when the change itself is so small? Was our professor exaggerating?
If you like, you can think about it as follows: $$ \int \frac{x^2+2.1x}{\left(x^3+3x^2+12\right)^6}dx = \int \frac{x^2+2x}{\left(x^3+3x^2+12\right)^6}dx + 0.1 \int \frac{x \, dx}{\left(x^3+3x^2+12\right)^6} $$ The first integral on the right-hand side is easy, and the second is the extra piece contributed by the $0.1$. The whole difference between the two problems comes down to whether you can handle that leftover piece or not...
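As a sanity check (a SymPy sketch of my own, not part of the classroom work), you can confirm that the clean answer really does differentiate back to the $x^2+2x$ integrand, and see the structural reason the substitution can't absorb the leftover $x$ term: $x$ is not a constant multiple of $u' = 3x^2+6x$.

```python
import sympy as sp

x = sp.symbols('x')

# Antiderivative found in class for the decimal-free problem
F = -1 / (15 * (x**3 + 3*x**2 + 12)**5)
integrand = (x**2 + 2*x) / (x**3 + 3*x**2 + 12)**6

# Differentiating F recovers the integrand, confirming the u-substitution answer
assert sp.simplify(sp.diff(F, x) - integrand) == 0

# The leftover piece from the 0.1: its numerator x is NOT a constant
# multiple of u' = 3x^2 + 6x, so the same substitution cannot absorb it
ratio = sp.simplify(x / (3*x**2 + 6*x))  # simplifies to 1/(3x + 6)
assert not ratio.is_constant()
```

The failed ratio test is exactly the obstruction: $u$-substitution needs the rest of the integrand to be (a constant times) $du$, and $x\,dx$ isn't.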