Let $$ A=\begin{bmatrix} a & b & 1 \\ b & 1 & b \\ 1 & a & a \\ \end{bmatrix} $$ Find $\det(A)$ in terms of $a$ and $b$, and write down the condition on $a$ and $b$ when $A$ is invertible. Simplify your answers.
I managed to find $\det(A)$ to be $a^2(1-b) + b^2(1-a) + ab - 1$.
For $A$ to be invertible, $\det(A)$ must be nonzero, but I can't solve this equation to find the condition on $a$ and $b$.
You correctly found the determinant.
Expanding it out, we have $\det(A) = -a^2b + a^2 - ab^2+ab+b^2-1$
In the case that $a=1$, this is $-b+1-b^2+b+b^2-1=0$
By the factor theorem (viewing $\det(A)$ as a polynomial in $a$), this implies that $(a-1)\mid \det(A)$.
Similarly, if $b=1$ we have $\det(A)=-a^2+a^2-a+a+1-1=0$, so also $(b-1)\mid \det(A)$
Since $a-1$ and $b-1$ are coprime in the polynomial ring, this implies that their product $(a-1)(b-1)=ab-a-b+1$ divides $\det(A)$.
Applying polynomial long division, we arrive at the factorization: $$\det(A) = -(a-1)(b-1)(a+b+1)$$
This implies that $A$ is invertible if and only if $$(a-1)(b-1)(a+b+1)\neq 0,$$
which is equivalent to saying $$a\neq 1,\quad b\neq 1,\quad\text{and}\quad a+b\neq -1.$$
As noted above, the first two conditions could be spotted by inspection: $a=1$ makes the first and third columns identical, and $b=1$ makes the second and third columns identical, and either forces a zero determinant.
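If you want to double-check the computation, a short SymPy sketch can compute the determinant symbolically and confirm the factorization (SymPy is an assumption here, not part of the original argument):

```python
import sympy as sp

a, b = sp.symbols('a b')

# The matrix from the question
A = sp.Matrix([[a, b, 1],
               [b, 1, b],
               [1, a, a]])

det = A.det()

# Confirm the expanded form matches the hand computation
assert sp.expand(det) == sp.expand(a**2*(1 - b) + b**2*(1 - a) + a*b - 1)

# Confirm the factorization -(a-1)(b-1)(a+b+1)
assert sp.expand(det) == sp.expand(-(a - 1)*(b - 1)*(a + b + 1))

print(sp.factor(det))
```

`sp.factor(det)` returns the fully factored determinant, from which the three invertibility conditions can be read off directly.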