A question goes:
Two parabolae have the same focus. If their directrices are the x-axis and the y-axis respectively, find the slope of their common chord.
To solve it, you write out the general equations $$(x-a)^2=4\left(\frac{1}{2}b\right)\left(y-\frac{b}{2}\right) \\(y-b)^2=4\left(\frac{1}{2}a\right)\left(x-\frac{a}{2}\right)$$ where $(a,b)$ is the common focus.
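(These come straight from the focus–directrix definition: with the common focus at $(a,b)$ and the x-axis as directrix, $$\sqrt{(x-a)^2+(y-b)^2}=|y|\;\Longrightarrow\;(x-a)^2=2by-b^2,$$ which is the first equation; the second follows symmetrically with the y-axis as directrix.)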
Subtracting one from the other gives $x^2-y^2=0$, or $(x+y)(x-y)=0$.
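Written out, the cross terms cancel immediately: $$(x^2-2ax+a^2)-(y^2-2by+b^2)=(2by-b^2)-(2ax-a^2)\;\Longrightarrow\;x^2=y^2.$$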
It can be shown that, for a given focus, two parabolae satisfying these properties intersect in exactly two points. So the slope of the common chord is either $1$ or $-1$, as obtained just above.
The problem is that the result seems like it should depend on the signs of $a$ and $b$. I played around with this applet for a while, and that is indeed how it plays out.
If I pick specific values of $a$ and $b$, say $3$ and $5$, the solution has one spurious root: $x+y=0$. But I can't work out why that root still appears when I carry out the same steps with these two specific parabolae. Similarly, negative values of $a$ and $b$ would make $x-y=0$ the spurious root.
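For what it's worth, the claim is easy to check numerically. Here is a minimal sketch using sympy (the variable names and the choice $a=3$, $b=5$ are mine): restrict the first parabola to each factor of $x^2-y^2=0$ and see which restriction has real roots.

```python
# Which factor of x^2 - y^2 = 0 yields real intersections when a = 3, b = 5?
from sympy import symbols, solve

x = symbols('x')
a, b = 3, 5

# First parabola, (x - a)^2 = 2by - b^2, restricted to each line:
on_slope_1  = (x - a)**2 - (2*b*x - b**2)    # substitute y =  x
on_slope_m1 = (x - a)**2 - (-2*b*x - b**2)   # substitute y = -x

print(solve(on_slope_1, x))   # two real roots, 8 ± sqrt(30)
print(solve(on_slope_m1, x))  # two complex roots, -2 ± sqrt(30)*I
```

With these values the solutions on $x+y=0$ come out complex, which matches $x+y=0$ being the spurious factor here.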
How exactly do these invalid roots end up in this analysis? A correct, rigorous treatment would show that each root is valid only for specific ranges of $a$ and $b$, as the applet suggests they are. Where did I go wrong in my attempt here?
Is the answer not slope $=1$? There are infinitely many possible values of $x$ and $y$ at the intersection points, since infinitely many pairs of equations satisfy the conditions. A slope of $1$ also makes intuitive sense, given that the question forces vertices symmetric across the line $y=x$ (or $y=-x$) together with perpendicular directrices.
Edit: The slope could very well be $-1$, depending on which quadrant the focus lies in.
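To check the quadrant dependence concretely, here is a minimal sympy sketch (the sample foci are my own choice), testing one focus per quadrant:

```python
# For a focus (a, b) in each quadrant, test which factor of x^2 - y^2 = 0
# carries the real intersection points of the two parabolae.
from sympy import symbols, solve, im

x = symbols('x')

for a, b in [(3, 5), (-3, 5), (-3, -5), (3, -5)]:  # quadrants I, II, III, IV
    # First parabola, (x - a)^2 = 2by - b^2, restricted to y = x and y = -x.
    slope_1  = solve((x - a)**2 - (2*b*x - b**2), x)
    slope_m1 = solve((x - a)**2 - (-2*b*x - b**2), x)
    print((a, b),
          "slope 1 real:", all(im(r) == 0 for r in slope_1),
          "slope -1 real:", all(im(r) == 0 for r in slope_m1))
```

The check shows real points on the slope-$1$ line exactly in quadrants I and III (where $ab>0$), and on the slope-$(-1)$ line in quadrants II and IV (where $ab<0$).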