Let's say I have the following matrix $A$
$$A = \begin{bmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{bmatrix}$$
And say I want to find the determinant of this matrix, using only the property that the determinant is a linear function of each row/column of $A$.
Then
$$\begin{align}\det(A) &= \begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix} \\ &= a\begin{vmatrix} \frac{1}{a} & 1 & a \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix} \ \ \ \ \text{(Step 1)}\\ &= ab\begin{vmatrix} \frac{1}{a} & 1 & a \\ \frac{1}{b} & 1 & b \\ 1 & c & c^2 \\ \end{vmatrix} \ \ \ \ \text{(Step 2)}\\ &= abc\begin{vmatrix} \frac{1}{a} & 1 & a \\ \frac{1}{b} & 1 & b \\ \frac{1}{c} & 1 & c \\ \end{vmatrix} \ \ \ \ \text{(Step 3)}\\ \end{align}$$
Now what I've done in Steps $1,2,3$ is the following
$\text{(Step 1)}: L\left(a \cdot \langle{a_1}^{T}\rangle\right) = a\cdot L\left(\langle{a_1}^{T}\rangle\right)$
$\text{(Step 2)}: L\left(b \cdot \langle{a_2}^{T}\rangle\right) = b\cdot L\left(\langle{a_2}^{T}\rangle\right)$
$\text{(Step 3)}: L\left(c \cdot \langle{a_3}^{T}\rangle\right) = c\cdot L\left(\langle{a_3}^{T}\rangle\right)$
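Written out for the first row (the other two steps are identical, one row at a time), the property each step relies on is homogeneity of the determinant in a single row:

```latex
% Homogeneity in one row, writing the rows as r_1, r_2, r_3
% (assuming a \neq 0 so that row 1 factors as a \cdot (1/a, 1, a)):
\det\begin{pmatrix} \lambda r_1 \\ r_2 \\ r_3 \end{pmatrix}
  = \lambda \, \det\begin{pmatrix} r_1 \\ r_2 \\ r_3 \end{pmatrix},
\qquad\text{e.g.}\qquad
\begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \end{vmatrix}
  = a \begin{vmatrix} \tfrac{1}{a} & 1 & a \\ 1 & b & b^2 \\ 1 & c & c^2 \end{vmatrix}.
```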
But then by performing these steps, does that not imply the following:
$$\begin{align} abc\begin{vmatrix} \frac{1}{a} & 1 & a \\ \frac{1}{b} & 1 & b \\ \frac{1}{c} & 1 & c \\ \end{vmatrix} = \underbrace{\begin{vmatrix} bc & abc & a^2bc \\ \frac{1}{b} & 1 & b \\ \frac{1}{c} & 1 & c \\ \end{vmatrix}}_{ abc\cdot L\left(\langle{a_1}^{T}\rangle\right) = L\left(abc \cdot \langle{a_1}^{T}\rangle\right)} = \underbrace{\begin{vmatrix} 1 & a & a^2 \\ 1 & b & b^2 \\ 1 & c & c^2 \\ \end{vmatrix}}_\text{By initial expression} \end{align}$$
EDIT: Contrary to what I initially believed, this last identity is correct. An easy way to check it is to expand the second and third determinants along the first row (multiplying each entry by its cofactor) and observe that the two expansions agree.
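As a quick numerical sanity check of the identity above, here is a minimal sketch: `det3` is just the explicit cofactor formula for a $3\times 3$ determinant, and the values of `a`, `b`, `c` are arbitrary nonzero test values (nonzero so that the $1/a$, $1/b$, $1/c$ entries are defined).

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of row lists,
    via cofactor expansion along the first row."""
    (p, q, r), (s, t, u), (v, w, x) = m
    return p * (t * x - u * w) - q * (s * x - u * v) + r * (s * w - t * v)

# Arbitrary nonzero test values.
a, b, c = 2.0, 3.0, 5.0

# The original determinant det(A).
original = det3([[1, a, a**2], [1, b, b**2], [1, c, c**2]])

# abc times the determinant with 1/a, 1/b, 1/c factored out of each row.
scaled_rows = a * b * c * det3([[1/a, 1, a], [1/b, 1, b], [1/c, 1, c]])

# The determinant with abc absorbed back into the first row only.
scaled_first_row = det3([[b*c, a*b*c, a**2 * b*c], [1/b, 1, b], [1/c, 1, c]])

assert abs(original - scaled_rows) < 1e-9
assert abs(original - scaled_first_row) < 1e-9
```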
Essentially, I'm having trouble finding a concrete understanding of the determinant as a linear function of the rows/columns of a matrix. Every textbook I've seen (including Introduction to Linear Algebra by Strang) treats the determinant as a linear function of the rows/columns in a very hand-wavy way, with a plethora of examples but never a concrete definition.
If someone could give a concrete definition of the determinant as a linear function of the rows/columns of $A$, and use the example above to show where my misconceptions lie, it would be greatly appreciated.
To compute this Vandermonde determinant, simple reasoning is enough.
First, the determinant must be a cubic polynomial in $a,b,c$: every term in the expansion is a product of one entry from each row, with exponents $0$, $1$ and $2$, so each term has total degree $3$. (For the analogous $n\times n$ matrix you would get a homogeneous polynomial of degree $0+1+\cdots+(n-1)=n(n-1)/2$.)
Next, whenever two of the parameters are equal, two rows coincide and the determinant vanishes. By the factor theorem, $(a-b)$, $(b-c)$ and $(c-a)$ all divide it, so the only cubic polynomials that qualify are
$$\lambda(a-b)(b-c)(c-a).$$
Finally, the main diagonal contributes the term $1\cdot b\cdot c^2$, and the coefficient of $bc^2$ in $(a-b)(b-c)(c-a)$ is $1$, which forces $\lambda=1$.
More generally, for points $x_1,\dots,x_n$ the $n\times n$ Vandermonde determinant is the product of all $n(n-1)/2$ pairwise differences, $\prod_{1\le i<j\le n}(x_j-x_i)$.
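As a brute-force sketch of that general claim at $n=4$, the determinant below is computed directly from the Leibniz permutation sum; the function names and test points are my own, not from the text.

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz sum over permutations (fine for small n)."""
    n = len(m)
    total = 0.0
    for perm in permutations(range(n)):
        # Sign of the permutation = (-1)^(number of inversions).
        inversions = sum(perm[i] > perm[j]
                         for i in range(n) for j in range(i + 1, n))
        total += (-1) ** inversions * prod(m[i][perm[i]] for i in range(n))
    return total

xs = [2.0, 3.0, 5.0, 7.0]  # arbitrary distinct test points

# Vandermonde matrix: row i is (1, x_i, x_i^2, x_i^3).
vandermonde = [[x ** k for k in range(len(xs))] for x in xs]

# Product of all n(n-1)/2 pairwise differences x_j - x_i, i < j.
pairwise = prod(xs[j] - xs[i]
                for i in range(len(xs)) for j in range(i + 1, len(xs)))

assert abs(det(vandermonde) - pairwise) < 1e-6
```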