Given the quadratic equation $x^2+px+q+1=0$ with two distinct roots $x_1$ and $x_2$.
If $p$ and $p^2+q^2$ are prime numbers, what is the largest possible value of $x_1^{2014}+x_2^{2014}$?
My attempt: First of all, I think I need to find formulas for the simplest expressions, like the sum and the product of the roots.
By Vieta's formulas, I get:
$$x_1+x_2= -p$$
$$x_1\times x_2= q+1$$
Then I think I need to compute further power sums, such as the squares and cubes, to help me find the correct pattern for the desired expression.
Using standard algebraic identities, I find that:
$$x_1^2+x_2^2=(-p)^2-2(q+1)$$
And $$x_1^3+x_2^3=(-p)^3-3(-p)(q+1)$$
Since the question asks for the 2014th power, I think these two identities are enough to find the pattern.
Rewriting the target expression, I get:
$$(x_1^{1007})^2+(x_2^{1007})^2= (x_1^{1007}+x_2^{1007})^2-2(x_1\cdot x_2)^{1007}$$
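These identities are all instances of one general recurrence: since $x_1$ and $x_2$ satisfy $x^2=-px-(q+1)$, the power sums $s_n=x_1^n+x_2^n$ obey $s_n=-p\,s_{n-1}-(q+1)\,s_{n-2}$ with $s_0=2$ and $s_1=-p$. A minimal sketch of this (the function name `power_sum` is mine, not from the problem):

```python
def power_sum(n, p, c):
    """Return s_n = x1**n + x2**n for the roots of x^2 + p*x + c = 0,
    via the recurrence s_n = -p*s_{n-1} - c*s_{n-2}, s_0 = 2, s_1 = -p."""
    s_prev, s_curr = 2, -p
    if n == 0:
        return s_prev
    for _ in range(n - 1):
        s_prev, s_curr = s_curr, -p * s_curr - c * s_prev
    return s_curr

# Cross-check against the closed-form identities above, with c = q + 1:
p, c = 3, 5
assert power_sum(2, p, c) == (-p) ** 2 - 2 * c            # x1^2 + x2^2
assert power_sum(3, p, c) == (-p) ** 3 - 3 * (-p) * c     # x1^3 + x2^3
```

This avoids having to find $x_1^{1007}+x_2^{1007}$ in closed form: once $p$ and $q$ are pinned down, any $s_n$ follows from the two previous terms.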
My question:
How can I get the value of $x_1^{1007}+x_2^{1007}$? And also, what is the use of the information that $p$ and $p^2+q^2$ are prime?
Thanks
Edit: $p$ and $q$ are also integers, and the roots are whole numbers.
Suppose $p$ is odd. Since $p^2+q^2$ is prime, either $p^2+q^2=2$ or $p \not\equiv q \pmod 2$ (if $p$ and $q$ had the same parity, $p^2+q^2$ would be an even number greater than $2$, hence composite). Because $p$ is prime, $|p|>1$, so we cannot have $p^2+q^2=2$. Therefore $p \not\equiv q \pmod 2$, and since $p$ is odd, $q$ must be even and $q+1$ must be odd. Now from $$x_1^2+px_1+q+1=0$$ we get $x_1(x_1+p)=-(q+1)$, which is odd, so $x_1$ is odd and $x_1+p$ is also odd. But then $p=(x_1+p)-x_1$ is a difference of two odd numbers, hence even, contradicting the initial supposition that $p$ is odd.
Conclusion: $p=2$ or $p=-2$. In this case the discriminant is $4 - 4(q+1) = -4q$, and since the roots are distinct it must be positive, therefore $q \lt 0$.
Because $x_1, x_2$ are whole numbers, we have $x_1x_2 \ge 0$ so $q+1 \ge 0$.
It follows that $-1 \le q \lt 0$, so $q=-1$. The solution $x_1=0$, $x_2=2$ (taking $p=-2$, so that $x_1+x_2=-p$) gives $x_1^{2014} + x_2^{2014} = 2^{2014}$.
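As a quick sanity check on the conclusion (assuming, as the argument above does, that $p=-2$ is admissible), one can verify that $p=-2$, $q=-1$ gives $p^2+q^2=5$ prime, integer roots $0$ and $2$, and the claimed power sum. The helper `is_prime` is my own small trial-division test:

```python
def is_prime(n):
    """Trial-division primality test; returns False for n < 2."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

p, q = -2, -1
assert is_prime(abs(p))          # |p| = 2 is prime
assert is_prime(p * p + q * q)   # p^2 + q^2 = 5 is prime

# Roots of x^2 + p*x + (q+1) = x^2 - 2x = 0, searched over small integers:
roots = [x for x in range(-10, 11) if x * x + p * x + (q + 1) == 0]
assert roots == [0, 2]

total = sum(r ** 2014 for r in roots)
assert total == 2 ** 2014
```

Python's arbitrary-precision integers make the exact value of $2^{2014}$ unproblematic to compute and compare.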