In this problem: Intersection of conics using matrix representation , a situation arises where there are two matrices (for example:) $$Q_1 = \begin{bmatrix}1 & 0 & 0\\0 & -1 & 0\\0 & 0 & -2\end{bmatrix}$$ $$Q_2 = \begin{bmatrix}1/2 & 0 & 0\\0 & -1 & 0\\0 & 0 & -1\end{bmatrix}$$
and we must solve $$\det(\lambda Q_1 + \mu Q_2) = 0.$$
Expanding this equation by hand is straightforward: we can group terms, rearrange, recognize the form, and solve: $$\left(\lambda + \frac{\mu}{2}\right)(-\lambda-\mu)(-2\lambda-\mu) = 0$$
(Note that the expression is not always this "nice" - e.g. when the matrices have fewer zero entries.)
However, I am trying to write code to do this. Even after simplifying by setting $\lambda=1$, the determinant computation still involves a symbolic variable ($\mu$). Without resorting to a big symbolic manipulation library, is there a different way to solve this procedurally?
Write $$\det(Q_1 + \mu Q_2) = \sum\limits_{k=0}^3 a_k\mu^k$$ and consider the polynomial $$P=\sum\limits_{k=0}^3 a_k X^k\in\mathbb{R}_3[X].$$
You need to find the roots of $P$.
Since the polynomial is of degree $3$ or less, you can solve it in closed form with the cubic formula: http://en.wikipedia.org/wiki/Cubic_function
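The coefficients $a_k$ can also be recovered without any symbolic algebra: since $\det(Q_1 + \mu Q_2)$ is a polynomial of degree at most $3$ in $\mu$, evaluating the determinant numerically at four distinct values of $\mu$ determines it completely, and interpolation gives the $a_k$. Here is a sketch in Python using NumPy (the sample points $\mu \in \{0, 1, -1, 2\}$ are an arbitrary choice; any four distinct values work):

```python
import numpy as np

# Example matrices from the question
Q1 = np.array([[1.0, 0.0, 0.0],
               [0.0, -1.0, 0.0],
               [0.0, 0.0, -2.0]])
Q2 = np.array([[0.5, 0.0, 0.0],
               [0.0, -1.0, 0.0],
               [0.0, 0.0, -1.0]])

# det(Q1 + mu*Q2) is a polynomial of degree <= 3 in mu.
# Evaluate it numerically at 4 distinct sample points ...
samples = np.array([0.0, 1.0, -1.0, 2.0])
values = np.array([np.linalg.det(Q1 + mu * Q2) for mu in samples])

# ... and interpolate to recover the coefficients [a3, a2, a1, a0]
# (degree-3 fit through exactly 4 points is exact interpolation).
coeffs = np.polyfit(samples, values, 3)

# Roots of the cubic, computed as eigenvalues of the companion matrix
roots = np.roots(coeffs)
```

For the matrices above, the determinant factors as $(1+\mu/2)(-1-\mu)(-2-\mu)$, so the recovered roots are $\mu = -1$ and $\mu = -2$ (the latter a double root). Note that `np.roots` handles the degenerate cases (degree $2$ or lower) gracefully if you strip leading near-zero coefficients first, which matters when $Q_1$ and $Q_2$ share structure that kills the cubic term.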