Can we subtract a trigonometric term from a polynomial?


Can we find the root of a function like $f(x) = x^2-\cos(x)$ using exact algebra, or do we need to resort to numerical approximation methods?

Thanks.




In general, the roots of an equation like yours cannot be expressed in closed form in terms of elementary functions. So in practice, numerical root-finding is the only way to go. Luckily, for your particular function, approximating a root is not hard using Newton's method.
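As a minimal numerical sketch (not part of the answer itself): since $f(0) = -1 < 0$ and $f(1) = 1 - \cos(1) > 0$, a root is bracketed in $(0, 1)$ and can be pinned down by plain bisection.

```python
import math

def f(x):
    return x**2 - math.cos(x)

# f(0) = -1 < 0 and f(1) = 1 - cos(1) > 0, so a root lies in (0, 1).
lo, hi = 0.0, 1.0
for _ in range(60):  # each step halves the bracket
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid
root = 0.5 * (lo + hi)
print(f"{root:.10f}")  # -> 0.8241323123
```

Bisection is slow (one bit of accuracy per step) but needs no derivative and cannot diverge, which makes it a safe baseline before switching to Newton's method.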


We can prove that the zeroes of $f(x) = x^2 - \cos(x)$ are transcendental using the Lindemann–Weierstrass Theorem: let $\alpha_1,\dotsc,\alpha_n,\beta_1,\dotsc,\beta_n$ be algebraic (over $\mathbb{Q}$). If $\alpha_1,\dotsc,\alpha_n$ are pairwise distinct and $\beta_1,\dotsc,\beta_n$ are non-zero, then $$ \beta_1 e^{\alpha_1} + \dotsb + \beta_n e^{\alpha_n} \neq 0. $$

Observe that $f(0) \neq 0$. Now suppose that $\alpha \neq 0$ is an algebraic number such that $f(\alpha) = 0$. Recalling that $\cos(x) = \frac{e^{ix}}{2}+\frac{e^{-ix}}{2}$ this means that $$ \alpha^2 \cdot e^0 - \frac{1}{2} \cdot e^{i\alpha} - \frac{1}{2} \cdot e^{-i\alpha} = 0 $$ contradicting the Lindemann–Weierstrass Theorem because $i\alpha \neq -i\alpha$ if $\alpha \neq 0$.
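The rewriting of $f(\alpha)$ in terms of exponentials is easy to sanity-check numerically (a throwaway sketch; $\alpha = 0.37$ is an arbitrary test value, not a root):

```python
import cmath

alpha = 0.37  # arbitrary test value

# f(alpha) written with cos ...
direct = alpha**2 - cmath.cos(alpha).real

# ... and via the exponential form used in the proof:
# alpha^2 * e^0 - (1/2) e^{i alpha} - (1/2) e^{-i alpha}
exp_form = (alpha**2 * cmath.exp(0)
            - 0.5 * cmath.exp(1j * alpha)
            - 0.5 * cmath.exp(-1j * alpha))

print(abs(direct - exp_form))  # tiny (floating-point noise only)
```

The two expressions agree to machine precision, confirming the identity $\cos(x) = \frac{e^{ix}+e^{-ix}}{2}$ on which the contradiction rests.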


It is straightforward to adapt this argument to $f(x) = p(x) + t(x)$ with $p(x) \in \bar{\mathbb{Q}}[x]$ and $t(x) \in \bar{\mathbb{Q}}[\sin(x),\cos(x)] \setminus \bar{\mathbb{Q}} = \bar{\mathbb{Q}}[e^{ix},e^{-ix}] \setminus \bar{\mathbb{Q}}$. With a bit of care it should be possible to extend it to $t(x) \in \bar{\mathbb{Q}}(e^{ix},e^{-ix}) \setminus \bar{\mathbb{Q}}$, too.

Note: This argument doesn't (immediately) exclude zeroes which can be written as a polynomial in transcendental numbers for which we have symbols (like $\log(n)$ for $n \in \mathbb{Z}_{\geq0}$, $\pi$, or $e$).


As said in the comments and answers, equations that mix polynomial and trigonometric terms do not admit analytical solutions, so numerical methods are required.

A simple method is Newton's iterative method: starting from a "reasonable" guess $x_0$, it updates the estimate according to $$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$$ In the case of the post, $$f(x)=x^2-\cos(x)\qquad f'(x)=2x+\sin(x)$$ By inspection, we can see that the positive root lies somewhere between $\frac{\pi}{6}$ and $\frac{\pi}{3}$, so let us start iterating at the middle of this interval, that is to say $x_0=\frac{\pi}{4}$. This generates the iterates $0.8250207908$, $0.8241327557$, $0.8241323123$, the last of which is the solution to ten significant digits.
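The iteration above can be sketched in a few lines (a minimal sketch; the function names are mine):

```python
import math

def f(x):
    return x**2 - math.cos(x)

def fprime(x):
    return 2*x + math.sin(x)

def newton(x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(math.pi / 4)
print(f"{root:.10f}")  # -> 0.8241323123
```

Starting from $x_0 = \pi/4$, the first iterate is $0.8250207908$, exactly as listed above, and three iterations suffice for ten significant digits.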

We could even start closer to the solution using a Taylor expansion: built at $x=0$, we have $$x^2-\cos(x)=-1+\frac{3 x^2}{2}-\frac{x^4}{24}+O\left(x^6\right)$$ Solving this as a quadratic in $x^2$ gives $x =\sqrt{2 \left(9-5 \sqrt{3}\right)}\approx 0.824313$, which makes Newton's method converge very fast.
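This seeding strategy can be sketched as follows (variable names are mine):

```python
import math

# Quartic Taylor model of x^2 - cos(x) at 0:  -1 + 3x^2/2 - x^4/24 = 0.
# Treated as a quadratic in x^2, the relevant solution is x^2 = 2(9 - 5*sqrt(3)).
x0 = math.sqrt(2 * (9 - 5 * math.sqrt(3)))
print(f"{x0:.6f}")  # -> 0.824313

# One Newton step from this seed already lands within ~1e-8 of the root.
x1 = x0 - (x0**2 - math.cos(x0)) / (2*x0 + math.sin(x0))
```

The seed is already correct to four decimals, so quadratic convergence makes the very first Newton step accurate to roughly eight digits.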

Edit

Since A.P. asked about it in the comments: there are methods of higher order than Newton's (which has quadratic convergence). Halley's method shows cubic convergence, Householder's method shows quartic convergence, and the order can be increased as far as one could want. The problem is not that these methods require higher-order derivatives, but that the formulas quickly become much more complex.

To give an idea, starting with $x_0=\frac{\pi}{4}$, let $x_1^{(n)}$ denote the first iterate of the method of order $n$. The values are $$x_1^{(2)}=0.8250207908$$ $$x_1^{(3)}=0.8240879089$$ $$x_1^{(4)}=0.8241350550$$ $$x_1^{(5)}=0.8241321224$$ $$x_1^{(6)}=0.8241323264$$ $$\cdots$$ $$x_1^{(\infty)}=0.8241323123$$
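As an illustration of a cubic-order step (a sketch; the answer does not state which order-3 family it used, and different cubic schemes give slightly different first iterates), here is Chebyshev's method, one standard cubic-convergence scheme built from $f$, $f'$, and $f''$:

```python
import math

def f(x):  return x**2 - math.cos(x)
def f1(x): return 2*x + math.sin(x)   # f'
def f2(x): return 2 + math.cos(x)     # f''

def chebyshev_step(x):
    """One step of Chebyshev's method (cubic convergence):
    x - f/f' - f'' * f^2 / (2 * f'^3)."""
    fx, f1x, f2x = f(x), f1(x), f2(x)
    return x - fx / f1x - f2x * fx**2 / (2 * f1x**3)

x1 = chebyshev_step(math.pi / 4)
print(f"{x1:.10f}")  # matches x_1^{(3)} = 0.8240879089 in the table above
```

Note the trade-off the answer mentions: the Newton step uses $f$ and $f'$ only, while each extra order of convergence drags in another derivative and a visibly messier update formula.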