The quadratic equation is $$ (a^2+b^2)x^2 - 2b(a+c)x + (b^2+c^2) = 0, $$ where $a$, $b$, and $c$ are non-zero, distinct real numbers, and the equation has non-zero real roots (so $D \geq 0$). One of the roots of this equation is also a root of which of the following equations?
A) $$(b^2-c^2)x^2 + 2a(b-c)x - (a^2) = 0$$ B) $$(b^2+c^2)x^2 - 2a(b+c)x + (a^2) = 0$$ C) $$(a^2)x^2 + a(c-b)x - (bc)=0$$ D) $$(a^2)x^2 - a(b-c)x + (bc)=0$$
Using $D \geq 0$, I found $b^2=ac$, and substituting this value of $b^2$ into the original equation (and cancelling the common factor $a+c$), I get: $$ax^2 - 2bx + c = 0$$
How should I proceed further?
As an alternative to the fine method given in the comments: as you noticed, the discriminant of the given equation is
$$\Delta =-4(ac-b^2)^2 \ge 0 \implies b^2=ac$$
which leads to
$$(a+c)(ax^2-2\sqrt{ac}\,x+c)=0$$ and since $ac=b^2>0$, $a$ and $c$ have the same sign (take both positive, WLOG), so $a+c\neq 0$ and $$ax^2-2\sqrt{ac}\,x+c=0 \implies (\sqrt a\, x-\sqrt c)^2=0$$
that is $$x=\frac{\sqrt c}{\sqrt a}=\frac b a=\frac c b$$
from which we see that only case C is compatible: substituting $x=\frac ba$ into C gives $$a^2\cdot\frac{b^2}{a^2}+a(c-b)\cdot\frac ba-bc=b^2+bc-b^2-bc=0,$$ and indeed C factors as $(ax-b)(ax+c)$, while options A, B, and D do not vanish at this root.
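For anyone who wants to double-check the algebra, the discriminant identity $\Delta=-4(ac-b^2)^2$ used above can be verified symbolically. A minimal sketch using SymPy (my own addition, not part of the original argument):

```python
import sympy as sp

a, b, c, x = sp.symbols('a b c x', real=True)

# Discriminant of (a^2 + b^2) x^2 - 2 b (a + c) x + (b^2 + c^2)
p = (a**2 + b**2)*x**2 - 2*b*(a + c)*x + (b**2 + c**2)
delta = sp.discriminant(p, x)

# It should equal -4 (ac - b^2)^2, which is <= 0, forcing b^2 = ac
assert sp.simplify(delta + 4*(a*c - b**2)**2) == 0
```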
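A quick numeric spot-check of the conclusion (my own illustration, with $a=1$, $b=2$, $c=4$ chosen so that $b^2=ac$ with all three distinct and non-zero): the common root $x=\frac ba=\frac cb=2$ annihilates the original equation and option C, but none of the other options.

```python
a, b, c = 1, 2, 4        # distinct, non-zero, with b**2 == a*c
x = b / a                # the double root: sqrt(c/a) = b/a = c/b = 2.0

# Original equation evaluated at x (should vanish)
orig = (a**2 + b**2)*x**2 - 2*b*(a + c)*x + (b**2 + c**2)

# The four candidate equations evaluated at x
opts = {
    "A": (b**2 - c**2)*x**2 + 2*a*(b - c)*x - a**2,
    "B": (b**2 + c**2)*x**2 - 2*a*(b + c)*x + a**2,
    "C": a**2*x**2 + a*(c - b)*x - b*c,
    "D": a**2*x**2 - a*(b - c)*x + b*c,
}

assert orig == 0
assert [k for k, v in opts.items() if v == 0] == ["C"]
```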