This may be very simple, but I need a formal proof to accept or reject the idea below.
Let $g$ be a polynomial of degree $n$, given by $$ g(L)=1-\theta_1 L-\theta_2 L^2- \ldots -\theta_n L^n, \quad \theta_i \in \mathbb{R}, $$ where some of the coefficients (but not $\theta_n$) are zero.
Under this condition, are there any straightforward rules for the roots of the polynomial?
For example, I tested $1-x^2$ and found that the roots have the same magnitude but opposite signs, so knowing one root is enough to find the other.
Thank you for sharing your ideas.
The expression $1-x^2$ can easily be factorized by the difference-of-squares method, which yields $(1-x)(1+x)$. More generally, for $a>0$, $1-ax^2$ factorizes as $(1-\sqrt a\, x)(1+\sqrt a\, x)$, which gives an easy way of finding the roots.
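As a quick sanity check of that factorization (assuming $a>0$), each factor vanishes at $x=\pm 1/\sqrt a$; the helper name below is my own, not from the question:

```python
import math

# Roots of 1 - a*x^2, read off from the factorization
# (1 - sqrt(a)*x)(1 + sqrt(a)*x), assuming a > 0: x = +/- 1/sqrt(a).
def roots_of_one_minus_ax2(a):
    r = 1 / math.sqrt(a)
    return (-r, r)

a = 4.0
neg, pos = roots_of_one_minus_ax2(a)
for x in (neg, pos):
    # both values satisfy 1 - a*x^2 = 0 (up to floating-point error)
    assert abs(1 - a * x * x) < 1e-12
print(neg, pos)  # -0.5 0.5
```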
In general, polynomials whose only nonzero terms are even powers of $x$ are called, not surprisingly, even. For these, the roots are symmetric around the origin, so knowing all the positive (or all the negative) roots gives the others. Conversely, polynomials whose only nonzero terms are odd powers of $x$ are called odd. Their roots are also symmetric around the origin, and they automatically have a root at $x=0$. Any function (not just a polynomial) can be decomposed uniquely into an even and an odd part, namely $f_{\text{even}}(x)=\tfrac{f(x)+f(-x)}{2}$ and $f_{\text{odd}}(x)=\tfrac{f(x)-f(-x)}{2}$.
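A small sketch illustrating both points, with polynomials stored as coefficient lists (`coeffs[k]` is the coefficient of $x^k$; this convention and the helper names are my own). It checks the even/odd decomposition identity and that an even polynomial's roots come in $\pm$ pairs via the substitution $y=x^2$:

```python
# A polynomial is represented as a list: coeffs[k] is the coefficient of x^k.

def evaluate(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

def even_odd_parts(coeffs):
    # Keep only even-power (resp. odd-power) terms.
    even = [c if k % 2 == 0 else 0 for k, c in enumerate(coeffs)]
    odd = [c if k % 2 == 1 else 0 for k, c in enumerate(coeffs)]
    return even, odd

# f(x) = 2 - x + 5x^3: check (f(x)+f(-x))/2 and (f(x)-f(-x))/2
# reproduce the even and odd parts at a few sample points.
f = [2, -1, 0, 5]
fe, fo = even_odd_parts(f)
for x in (-2.0, 0.5, 3.0):
    assert abs((evaluate(f, x) + evaluate(f, -x)) / 2 - evaluate(fe, x)) < 1e-9
    assert abs((evaluate(f, x) - evaluate(f, -x)) / 2 - evaluate(fo, x)) < 1e-9

# g(x) = 1 - 3x^2 + x^4 is even. Substituting y = x^2 gives 1 - 3y + y^2,
# with roots y = (3 +/- sqrt(5))/2; each positive y-root yields x = +/- sqrt(y).
g = [1, 0, -3, 0, 1]
for y in ((3 + 5**0.5) / 2, (3 - 5**0.5) / 2):
    r = y**0.5
    assert abs(evaluate(g, r)) < 1e-9 and abs(evaluate(g, -r)) < 1e-9
```

The substitution $y=x^2$ is the practical payoff: an even polynomial of degree $2m$ reduces to a polynomial of degree $m$ in $y$.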
Although this simplifies calculations, there is no general algebraic formula (in radicals) for the roots of polynomials of degree 5 and above. This is the Abel–Ruffini theorem; Galois theory goes further and characterizes exactly which polynomials are solvable by radicals.