Find the condition that the diagonals of a parallelogram formed by $ax+by+c=0$, $ax+by+c'=0$, $a'x+b'y+c=0$ and $a'x+b'y+c'=0$ are at right angles.
My Attempt:
The equation of the diagonal passing through the point of intersection of $ax+by+c=0$ and $a'x+b'y+c=0$ is $$(ax+by+c)+K(a'x+b'y+c)=0,$$ where $K$ is an arbitrary constant.
Similarly, the equation of the diagonal passing through the point of intersection of $ax+by+c=0$ and $a'x+b'y+c'=0$ is $$(ax+by+c)+L(a'x+b'y+c')=0,$$ where $L$ is an arbitrary constant.
How do I complete the rest?
For two parallel lines in the $xy$-plane given by the equations
\begin{align*} Ax+By+C_1 = 0\\[4pt] Ax+By+C_2 = 0 \end{align*}
the distance between them is given by the formula
$$\frac{\left|C_2 - C_1\right|}{\sqrt{A^2+B^2}}$$
(see https://en.wikipedia.org/wiki/Distance_between_two_straight_lines)
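As a quick numerical sanity check of this formula, here is a minimal sketch in Python (the helper name `parallel_line_distance` is my own, not from any library):

```python
from math import hypot

def parallel_line_distance(A, B, C1, C2):
    """Distance between the parallel lines Ax+By+C1=0 and Ax+By+C2=0."""
    return abs(C2 - C1) / hypot(A, B)

# Example: 3x+4y+1=0 and 3x+4y+11=0 are |11-1|/sqrt(3^2+4^2) = 10/5 = 2 apart.
print(parallel_line_distance(3, 4, 1, 11))  # 2.0
```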
But given a parallelogram,
$$\text{the diagonals are perpendicular}$$ $$\text{if and only if}$$ $$\text{the parallelogram is a rhombus}$$ $$\text{if and only if}$$ $$\text{the distances between the two pairs of opposite sides are equal}$$
(For the last equivalence: in a parallelogram with side lengths $p$ and $q$ and interior angle $\theta$, the two distances are $q\sin\theta$ and $p\sin\theta$, so they are equal exactly when $p=q$.)
Applying the above to the two pairs of parallel lines forming the sides of your parallelogram, you get
$$ \frac{\left|c - c'\right|}{\sqrt{a^2+b^2}} = \frac{\left|c - c'\right|}{\sqrt{(a')^2+(b')^2}}$$
Since $c \ne c'$ (otherwise opposite sides coincide and there is no parallelogram), cancelling $\left|c - c'\right|$ yields the condition
$$a^2+b^2 = (a')^2+(b')^2$$
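If it helps to see the condition in action, here is a small numerical check (a sketch using NumPy; the coefficient values are arbitrary, chosen so that $a^2+b^2 = (a')^2+(b')^2 = 25$):

```python
import numpy as np

def vertex(l1, l2):
    """Intersection point of two lines, each given as (A, B, C) meaning Ax+By+C=0."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    return np.linalg.solve([[a1, b1], [a2, b2]], [-c1, -c2])

# Coefficients with a^2+b^2 = a'^2+b'^2: (3, 4) and (5, 0), both of norm 5.
a, b, c, cp = 3, 4, 1, -7
ap, bp = 5, 0

# The four vertices of the parallelogram, in cyclic order around its boundary.
P = vertex((a, b, c),  (ap, bp, c))
Q = vertex((a, b, c),  (ap, bp, cp))
R = vertex((a, b, cp), (ap, bp, cp))
S = vertex((a, b, cp), (ap, bp, c))

# The diagonals are PR and QS; their direction vectors should be orthogonal.
print(np.dot(R - P, S - Q))  # 0.0 (up to floating-point error)
```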