Taylor polynomial of degree 2 of $e^{x^2+x}$


I want to find the Taylor polynomial of degree 2 of $e^{x^2+x}$ and this is what the answer should be:

$$e^{x^2+x} = e^{x^2}e^{x} = (1 + x^2 + O(x^4)) \left(1 + x + \frac{x^2}{2} + O(x^3)\right) = 1 + x + \frac{3x^2}{2} + O(x^3)$$

The thing is, I don't understand what happens after the last equals sign. If anyone would care to explain, that would be much appreciated.

You multiply the two factors using the usual rules for polynomials (distributivity/associativity), with the additional rules that $O(x^k)\cdot \alpha x^\ell = O(x^{k+\ell})$ and $O(x^k)\cdot O(x^\ell) = O(x^{k+\ell})$:
$$
(1 + x^2 + O(x^4)) \left(1 + x + \frac{x^2}{2} + O(x^3)\right) = 1 + x + \frac{x^2}{2} + O(x^3) + x^2 + \cdots
$$
except that you stop as soon as a power is at least as large as the smallest exponent appearing in any of the $O(\cdot)$ terms so far. This is why I stopped the expansion here and did not include any $x^3$, $x^4$, or $x^5$ terms: there is no point, as the $O(x^3)$ already "swallows" all of them. You can carry the expansion further, but in the end every $x^k$ term with $k\geq 3$ disappears, since they are all hidden in the $O(x^3)$.

Then you gather the remaining terms:
$$
1 + x + \frac{x^2}{2} + O(x^3) + x^2 = 1 + x + \frac{3}{2}x^2 + O(x^3)
$$
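As a sanity check (not part of the original answer), the same expansion can be reproduced with SymPy's `series`, and the "drop every power that is swallowed by $O(x^3)$" step can be done by hand on the product of the two truncated polynomials:

```python
import sympy as sp

x = sp.symbols('x')

# Direct Taylor expansion of e^(x^2 + x) around x = 0, up to (not including) x^3.
taylor = sp.series(sp.exp(x**2 + x), x, 0, 3)
print(taylor)  # 1 + x + 3*x**2/2 + O(x**3)

# Multiplying the two truncated series by hand, then discarding every
# x^k term with k >= 3 (they are all hidden in the O(x^3)):
product = sp.expand((1 + x**2) * (1 + x + x**2 / 2))
truncated = sum(product.coeff(x, k) * x**k for k in range(3))
print(truncated)  # prints 1 + x + 3x^2/2, matching the answer
```

Both routes give $1 + x + \frac{3}{2}x^2$, which is the degree-2 Taylor polynomial from the question.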