I often come across these kinds of problems in A-level exam papers:
Prove that $$ \begin{vmatrix} (a+b)^2 & 1 & 1 \\ a^2 & (1+b)^2 & a^2 \\ b^2 & b^2 & (1+a)^2 \end{vmatrix} = 2ab(1+a+b)^3 $$
or
Prove that $$ \begin{vmatrix} 1 & 1 & 1 \\ x & y & z \\ yz & xz & xy \end{vmatrix} = (x-y)(y-z)(z-x) $$
These are questions involving nice determinant identities, but they are quite a pain to prove by direct expansion.
Are there any tricks that one can use to prove such results? I'm familiar with Vandermonde matrices for example, but I haven't come across anything that might help me with these, especially the first one.
The idea is to use elementary row and column operations to reduce to a simpler determinant. Since the matrix is $3\times 3$, the goal is to produce two zeros in a single row or column, so that expanding along it leaves only one term.
You have
$$A=\begin{bmatrix}(a+b)^2 & 1 & 1 \\a^2 & (1+b)^2 & a^2 \\b^2 & b^2 & (1+a)^2 \end{bmatrix}$$
Then
\begin{align} \det A&=\det\begin{bmatrix}(a+b)^2-1 & 0 & 1 \\0 & (1+b)^2-a^2 & a^2 \\b^2-(1+a)^2 & b^2-(1+a)^2 & (1+a)^2 \end{bmatrix}\qquad\quad[C_j'=C_j-C_3\,,j=1,2] \\\\&=\det\begin{bmatrix}(a+b+1)(a+b-1) & 0 & 1 \\0 & (a+b+1)(1+b-a) & a^2 \\(a+b+1)(b-1-a) & (a+b+1)(b-1-a) & (1+a)^2 \end{bmatrix} \\\\&=(a+b+1)^2\det\begin{bmatrix}a+b-1 & 0 & 1 \\0 & 1+b-a & a^2 \\ b-1-a & b-1-a & (1+a)^2 \end{bmatrix} \\\\&=(a+b+1)^2\det\begin{bmatrix}a+b-1 & 0 & 1 \\0 & 1+b-a & a^2 \\ -2a & -2 & 2a \end{bmatrix}\qquad\qquad[R_3'=R_3-(R_1+R_2)] \\\\&=\frac{2(a+b+1)^2}{a}\det\begin{bmatrix}a+b-1 & 0 & 1 \\0 & a+ab-a^2 & a^2 \\ -a & -a & a \end{bmatrix}\qquad\qquad\quad[C_2'=aC_2] \\\\&=2(a+b+1)^2\det\begin{bmatrix}a+b-1 & 0 & 1 \\0 & a+ab-a^2 & a^2 \\ -1 & -1 & 1 \end{bmatrix} \\\\&=2(a+b+1)^2\det\begin{bmatrix}a+b & 1 & 1 \\a^2 & a+ab & a^2 \\ 0 & 0 & 1 \end{bmatrix}\qquad\qquad\qquad[C_j'=C_j+C_3\,,j=1,2] \end{align}
Now expand with respect to the third row:
$$\det A = 2(1+a+b)^2\bigl[(a+b)(a+ab)-a^2\bigr] = 2(1+a+b)^2\cdot ab(1+a+b) = 2ab(1+a+b)^3.$$
(One caveat: the step $C_2'=aC_2$ implicitly assumes $a\neq 0$; since both sides of the identity are polynomials in $a$, the case $a=0$ follows by continuity, or by a direct check.)
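As a sanity check before (or after) hunting for the right row/column operations, both identities can be verified symbolically. A minimal sketch using SymPy (assuming it is installed):

```python
# Symbolic verification of both determinant identities with SymPy.
from sympy import symbols, Matrix, expand

a, b, x, y, z = symbols('a b x y z')

# First identity: det = 2ab(1+a+b)^3
M1 = Matrix([[(a + b)**2, 1,            1],
             [a**2,       (1 + b)**2,   a**2],
             [b**2,       b**2,         (1 + a)**2]])
# expand(...) == 0 confirms the two polynomials agree identically
assert expand(M1.det() - 2*a*b*(1 + a + b)**3) == 0

# Second identity: det = (x-y)(y-z)(z-x)
M2 = Matrix([[1,   1,   1],
             [x,   y,   z],
             [y*z, x*z, x*y]])
assert expand(M2.det() - (x - y)*(y - z)*(z - x)) == 0

print("both identities hold")
```

This does not replace a proof, of course, but it rules out typos in the claimed right-hand sides before you invest effort in the manipulations.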