If a, b, c, and d are real, and w is a non-real cube root of unity...


If a, b, c, and d are real numbers and w is a non-real cube root of unity, and if

$$\frac{1}{a+\omega} + \frac{1}{b+\omega} + \frac{1}{c+\omega} + \frac{1}{d+\omega} = \frac{2}{\omega},$$

prove that

$$\frac{1}{a+\omega^2} + \frac{1}{b+\omega^2} + \frac{1}{c+\omega^2} + \frac{1}{d+\omega^2} = \frac{2}{\omega^2}.$$

Note -

I know that one way of proving this is by taking the conjugate of both sides, since the conjugate of w is w^2 (as a, b, c, and d are real). But I was wondering whether it can be done without using the complex conjugate.
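For completeness, here is a sketch (in LaTeX) of the conjugation step I mentioned, spelled out term by term; it relies only on the facts stated in the problem (a, b, c, d real) and on $\bar{\omega} = \omega^2$ for a non-real cube root of unity:

```latex
% Since a is real, \bar{a} = a, and since \omega^3 = 1 with \omega \neq 1,
% we have \bar{\omega} = \omega^2 (conjugation permutes the non-real roots).
% Hence each term on the left conjugates as
\overline{\left(\frac{1}{a+\omega}\right)}
  = \frac{1}{\bar{a}+\bar{\omega}}
  = \frac{1}{a+\omega^2},
% and the right-hand side conjugates as
\overline{\left(\frac{2}{\omega}\right)}
  = \frac{2}{\bar{\omega}}
  = \frac{2}{\omega^2}.
```

Conjugating the given identity term by term therefore yields the required identity directly; my question is whether the same conclusion can be reached without invoking conjugation at all.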

Please note - this is certainly a homework question, but as I have already attempted it many times and have presented a possible solution, I think I am eligible for a little help. Thank you!

Exact question here - https://i.stack.imgur.com/6d4p2.jpg