This has been hurting my head for a while now. If
$$
\det\begin{bmatrix}a&a^2&1+a^3\\b&b^2&1+b^3\\c&c^2&1+c^3\end{bmatrix}=0
$$
and
$$
\det\begin{bmatrix}a&a^2&1\\b&b^2&1\\c&c^2&1\end{bmatrix}\neq 0,
$$
then show that $abc=-1$.
Here is a solution that does not require expanding out the determinant:
Since the determinant of the first matrix is zero, some nontrivial linear combination of its columns equals the zero vector. Thus there exist $p,q,r\in\mathbb{C}$, not all $0$, such that $a,b,$ and $c$ are solutions to
$$ p x^3+q x^2 + r x+p =0. $$
If we can show that $p\neq 0$ and that $a,b,$ and $c$ are distinct, then we're done, since in that case
$$ (x-a)(x-b)(x-c)=x^3+\frac{q}{p} x^2+\frac{r}{p} x + 1, $$
and comparing constant terms gives $-abc=1$, i.e. $abc=-1$.

Now if, without loss of generality, $a=b$, then the first two rows of the second matrix would be equal, and its determinant would be zero. So $a,b,$ and $c$ are distinct. Similarly, if $p=0$, then $a,b,$ and $c$ would all satisfy $q x^2 + r x=0$, which means that $q$ times the middle column of the second matrix plus $r$ times its first column would be the zero vector; since $q$ and $r$ are not both zero, this is a nontrivial combination of columns, which would force the determinant to be zero. So $p\neq 0$, and we're done.
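The construction behind this argument can be sanity-checked numerically: take any monic cubic whose constant term equals its leading coefficient and whose roots are distinct; its roots $a,b,c$ then satisfy both hypotheses, and Vieta's formulas give $abc=-1$. A minimal numpy sketch (the coefficients `q = 2`, `r = 3` are an arbitrary choice of mine):

```python
import numpy as np

# Roots of x^3 + q x^2 + r x + 1 for an arbitrary choice of q, r.
q, r = 2.0, 3.0
a, b, c = np.roots([1.0, q, r, 1.0])

M1 = np.array([[a, a**2, 1 + a**3],
               [b, b**2, 1 + b**3],
               [c, c**2, 1 + c**3]])
M2 = np.array([[a, a**2, 1],
               [b, b**2, 1],
               [c, c**2, 1]])

print(abs(np.linalg.det(M1)))  # ~0: the first determinant vanishes
print(abs(np.linalg.det(M2)))  # nonzero: the roots are distinct
print(a * b * c)               # ~ -1, as Vieta predicts
```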
First note that, by linearity of the determinant in its last column, $$\begin{vmatrix} a & a^2 & a^3+1 \\ b & b^2 & b^3+1 \\ c & c^2 & c^3+1 \\ \end{vmatrix} = \begin{vmatrix} a & a^2 & a^3 \\ b & b^2 & b^3 \\ c & c^2 & c^3 \\ \end{vmatrix} + \begin{vmatrix} a & a^2 & 1 \\ b & b^2 & 1 \\ c & c^2 & 1 \\ \end{vmatrix}=0.$$ Factoring $a,b,c$ out of the rows of the first determinant on the right, and noting that the second equals $\begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix}$ (a cyclic shift of the columns, which is an even permutation), we get $$abc\begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix} = \begin{vmatrix} a & a^2 & a^3 \\ b & b^2 & b^3 \\ c & c^2 & c^3 \\ \end{vmatrix} = -\begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix}.$$
Since this determinant is nonzero by hypothesis, we may divide by it. Therefore, $abc=-1$.
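If you want a machine check of the three determinant facts used here, a short sympy sketch suffices (the use of sympy is my addition, not part of the argument):

```python
import sympy as sp

# Symbolically verify the three determinant identities used above.
a, b, c = sp.symbols('a b c')
V  = sp.Matrix([[1, a, a**2], [1, b, b**2], [1, c, c**2]]).det()  # Vandermonde
D1 = sp.Matrix([[a, a**2, 1], [b, b**2, 1], [c, c**2, 1]]).det()
D3 = sp.Matrix([[a, a**2, a**3], [b, b**2, b**3], [c, c**2, c**3]]).det()
Dm = sp.Matrix([[a, a**2, a**3 + 1],
                [b, b**2, b**3 + 1],
                [c, c**2, c**3 + 1]]).det()

print(sp.expand(Dm - (D3 + D1)) == 0)  # linearity in the last column
print(sp.expand(D3 - a*b*c*V) == 0)    # factor a, b, c out of the rows
print(sp.expand(D1 - V) == 0)          # cyclic column shift: even permutation
```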
The hard part is to compute the determinants and factor them nicely:
$ \det\begin{bmatrix}a&a^2&1\\b&b^2&1\\c&c^2&1\end{bmatrix} = (ab^2+bc^2+ca^2)- (b^2c+c^2a+a^2b) = (a-b)(b-c)(c-a) $
and
$\det\begin{bmatrix}a&a^2&1+a^3\\b&b^2&1+b^3\\c&c^2&1+c^3\end{bmatrix}$
$= (ab^2(1+c^3)+bc^2(1+a^3)+ca^2(1+b^3)) - ((1+a^3)b^2c+(1+b^3)c^2a+(1+c^3)a^2b)$
$= (ab^2+bc^2+ca^2) - (b^2c+c^2a+a^2b)$ $+ (ab^2c^3+bc^2a^3+ca^2b^3) - (a^3b^2c+b^3c^2a+c^3a^2b)$
$= \left[(ab^2+bc^2+ca^2) - (b^2c+c^2a+a^2b)\right] + abc\left[(bc^2+ca^2+ab^2) - (a^2b+b^2c+c^2a)\right]$
$= \left[(ab^2+bc^2+ca^2) - (b^2c+c^2a+a^2b)\right](1+abc)$
$= (a-b)(b-c)(c-a)(1+abc)$
The rest is easy: the first determinant is $0$, while $(a-b)(b-c)(c-a)$ is the second determinant, which is nonzero by hypothesis, so $1+abc=0$, i.e. $abc=-1$.
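Both closed forms are easy to confirm with a CAS; here is a quick sympy check (the sympy verification is my addition, not part of the original computation):

```python
import sympy as sp

# Verify the two factorizations computed above.
a, b, c = sp.symbols('a b c')
D2 = sp.Matrix([[a, a**2, 1], [b, b**2, 1], [c, c**2, 1]]).det()
D1 = sp.Matrix([[a, a**2, 1 + a**3],
                [b, b**2, 1 + b**3],
                [c, c**2, 1 + c**3]]).det()

print(sp.expand(D2 - (a - b)*(b - c)*(c - a)) == 0)              # True
print(sp.expand(D1 - (a - b)*(b - c)*(c - a)*(1 + a*b*c)) == 0)  # True
```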