I was reading about how the number of multiplications required to multiply two complex numbers can be reduced (trading the saved multiplications for an increase in the number of additions and subtractions). The text then went on to say that a similar algorithm can be used to multiply two univariate polynomials (i.e., you can reduce the number of multiplications needed to multiply two univariate polynomials).
How can this be done?
By using $a * b = \mathrm{FFT}^{-1}\big(\mathrm{FFT}(a)\cdot \mathrm{FFT}(b)\big)$, where $*$ is convolution and $a, b$ are the two polynomials' coefficient vectors, zero-padded to the length of the product. Since convolution becomes pointwise multiplication in the Fourier domain, the $O(n^2)$ coefficient products reduce to $O(n)$. Transforming a coefficient vector to the Fourier domain is not free, but it can be done in $O(n\log n)$ time by using the fast Fourier transform. Thus you need $O(n\log n)$ time to multiply two polynomials.
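As a minimal sketch of this pipeline (pad, transform, multiply pointwise, transform back), here is a self-contained radix-2 Cooley-Tukey FFT in pure Python; the function names `fft` and `poly_mul` are just illustrative choices, not from the original text:

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)  # twiddle factor
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def poly_mul(a, b):
    """Multiply coefficient vectors a and b in O(n log n) via the FFT."""
    n = len(a) + len(b) - 1           # number of coefficients in the product
    size = 1 << (n - 1).bit_length()  # next power of two >= n
    fa = fft(a + [0] * (size - len(a)))
    fb = fft(b + [0] * (size - len(b)))
    pointwise = [x * y for x, y in zip(fa, fb)]
    c = fft(pointwise, invert=True)
    # the unscaled inverse returns size * coefficients; round back to integers
    return [round((x / size).real) for x in c[:n]]

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
print(poly_mul([1, 2], [3, 4]))
```

In practice you would use a library FFT rather than this recursive version, and for integer coefficients you must round the results, since the transform works in floating-point complex arithmetic.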
Edit: To use specifically Karatsuba's method that you have in mind, you can 'split' your polynomial. As an example: $$p(x) = a_0 + a_1x+a_2x^2+\dots+a_nx^n = $$ $$(a_0 + a_1x) + x^2(a_2 + a_3x) + \dots + x^{n-1}(a_{n-1} + a_nx)$$ (here $n$ is odd, so the coefficients pair up evenly). Then perform the procedure recursively: each product of two such pairs needs only three multiplications instead of four. You can split into blocks of other degrees too. Most people use the FFT, though.
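The answer's pairwise split works; the more common presentation of Karatsuba splits the coefficient vector into a low and a high half, $p(x) = p_0(x) + x^{h}p_1(x)$, and replaces the four half-size products with three. A sketch under the assumption that both inputs have the same power-of-two length (the name `karatsuba` and the padding convention are my own choices):

```python
def karatsuba(a, b):
    """Multiply coefficient vectors a and b with 3 recursive multiplications
    instead of 4, giving O(n^{log2 3}) ~ O(n^1.585) coefficient products.
    Assumes len(a) == len(b) == a power of two (pad with zeros otherwise)."""
    n = len(a)
    if n == 1:
        return [a[0] * b[0]]
    h = n // 2
    a0, a1 = a[:h], a[h:]          # p(x) = a0(x) + x^h a1(x)
    b0, b1 = b[:h], b[h:]          # q(x) = b0(x) + x^h b1(x)
    low = karatsuba(a0, b0)        # a0 * b0
    high = karatsuba(a1, b1)       # a1 * b1
    mid = karatsuba([x + y for x, y in zip(a0, a1)],
                    [x + y for x, y in zip(b0, b1)])   # (a0+a1)(b0+b1)
    # a0*b1 + a1*b0 = (a0+a1)(b0+b1) - a0*b0 - a1*b1: the saved multiplication
    mid = [m - lo - hi for m, lo, hi in zip(mid, low, high)]
    # assemble: low + x^h * mid + x^{2h} * high
    res = [0] * (2 * n - 1)
    for i, v in enumerate(low):
        res[i] += v
    for i, v in enumerate(mid):
        res[i + h] += v
    for i, v in enumerate(high):
        res[i + 2 * h] += v
    return res

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
print(karatsuba([1, 2], [3, 4]))
```

Unlike the FFT approach, this stays in exact integer arithmetic, which is why Karatsuba is still used for big-integer and exact polynomial multiplication at moderate sizes.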