Find polynomials $u(x)$ and $v(x)$ such that $(2x + 3) \cdot u(x) + (x^2 + 1) \cdot v(x) = 1$.
I am trying to use the Euclidean algorithm for integers as a model, but I keep getting stuck. So far I have:
$x^2 + 1 = \frac{x}{2}(2x + 3) - \frac{3}{2}x + 1$ which leads to
$2x + 3 = -\frac{4}{3}(-\frac{3}{2}x + 1) + \frac{13}{3}$
and then I get stuck. I feel like there's something obvious that I'm doing wrong here, but I have no clue what it is. Can anyone help? Thanks!
Your calculation is incomplete: a Euclidean division step has to continue until the remainder has strictly smaller degree than the divisor. In your first line the remainder $-\frac{3}{2}x + 1$ still has degree $1$, the same as $2x + 3$, so you stopped the division too early. Carrying the division of $x^2 + 1$ by $2x + 3$ to completion gives
$$x^2 + 1 = \left(\frac{x}{2} - \frac{3}{4}\right)(2x + 3) + \frac{13}{4},$$
$$2x + 3 = \left(\frac{8}{13}x + \frac{12}{13}\right)\cdot\frac{13}{4} + 0.$$
The last nonzero remainder, $\frac{13}{4}$, is a constant, so $\gcd(2x + 3,\, x^2 + 1) = 1$. Now back-substitute, just as you would in the integer Euclidean algorithm.
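Concretely, solving the completed division line for the constant remainder and rescaling yields one valid choice of $u$ and $v$:
$$\frac{13}{4} = (x^2 + 1) - \left(\frac{x}{2} - \frac{3}{4}\right)(2x + 3).$$
Multiplying both sides by $\frac{4}{13}$,
$$1 = (2x + 3)\cdot\frac{3 - 2x}{13} + (x^2 + 1)\cdot\frac{4}{13},$$
so $u(x) = \dfrac{3 - 2x}{13}$ and $v(x) = \dfrac{4}{13}$ work.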
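If you want to sanity-check a candidate pair (back-substituting the division above gives $u(x) = \frac{3-2x}{13}$, $v(x) = \frac{4}{13}$), a short script with exact fractions will do it; the coefficient-list representation here is just an illustrative choice, not part of the problem:

```python
from fractions import Fraction as F

def poly_mul(p, q):
    # Multiply two polynomials given as coefficient lists (constant term first).
    r = [F(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_add(p, q):
    # Add two coefficient lists, padding the shorter one with zeros.
    n = max(len(p), len(q))
    p = p + [F(0)] * (n - len(p))
    q = q + [F(0)] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

# f = 2x + 3 and g = x^2 + 1, coefficients listed from the constant term up.
f = [F(3), F(2)]
g = [F(1), F(0), F(1)]

# Candidate Bezout coefficients: u = (3 - 2x)/13, v = 4/13.
u = [F(3, 13), F(-2, 13)]
v = [F(4, 13)]

total = poly_add(poly_mul(f, u), poly_mul(g, v))
print(total)  # → [Fraction(1, 1), Fraction(0, 1), Fraction(0, 1)]
```

All the $x$ and $x^2$ terms cancel exactly, leaving the constant polynomial $1$, which confirms $(2x+3)u(x) + (x^2+1)v(x) = 1$.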