Let $A$ be an $n\times n$ matrix, let $i,j,k$ be indices with $1\leq i,j,k\leq n$ and $k\neq i$, $k\neq j$, and let $\alpha,\beta \in \mathbb{R}$.
Suppose that the $k$-th row $\mathbf{a}_k$ is equal to $\alpha \mathbf{a}_i+\beta \mathbf{a}_j$, where $\mathbf{a}_i, \mathbf{a}_j \in \mathbb{R}^n$ denote the $i$-th and $j$-th rows of $A$, respectively.
I need to prove that $\det(A)=0$.
I know that $\det(A)=0$ when two rows are equal, and that adding a scalar multiple of one row of $A$ to another row does not change the determinant. I'm just having trouble assembling these facts into a proof.
Start with $A$. Let $B$ be the matrix obtained by adding $-\alpha$ times row $i$ and $-\beta$ times row $j$ to row $k$. These row operations leave the determinant unchanged, so $\det A = \det B$. But row $k$ of $B$ is $\mathbf{a}_k - \alpha\mathbf{a}_i - \beta\mathbf{a}_j = \mathbf{0}$, and a matrix with a zero row has determinant $0$. Thus $\det A = \det B = 0$.
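As a quick numerical sanity check (this is just an illustration with NumPy, not part of the proof), here is a hypothetical $3\times 3$ example with $k=3$, $i=1$, $j=2$, $\alpha=2$, $\beta=-1$:

```python
import numpy as np

# Hypothetical example: third row is alpha*(row 1) + beta*(row 2).
alpha, beta = 2.0, -1.0
a1 = np.array([1.0, 1.0, 1.0])
a2 = np.array([1.0, 0.0, 1.0])
A = np.vstack([a1, a2, alpha * a1 + beta * a2])

# Subtracting alpha*(row 1) and beta*(row 2) from row 3 zeroes it out
# without changing the determinant, so det(A) must be 0.
print(np.linalg.det(A))  # ~0, up to floating-point error
```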
Here is an illustration: row operations can be carried out by elementary matrices. For instance, starting with the matrix $\begin{bmatrix}1& 1& 1 \\ 1& 0 &1 \\ 0& 0 &1 \end{bmatrix}$, if you add $3$ times row 1 to row 2 you get $\begin{bmatrix}1& 1& 1 \\ 4& 3 &4 \\ 0& 0 &1 \end{bmatrix}$. This is carried out by the elementary matrix that performs the same row operation on the identity: $$ \begin{bmatrix}1& 0& 0 \\ 3& 1 &0 \\ 0& 0 &1 \end{bmatrix} \begin{bmatrix}1& 1& 1 \\ 1& 0 &1 \\ 0& 0 &1 \end{bmatrix} = \begin{bmatrix}1& 1& 1 \\ 4& 3 &4 \\ 0& 0 &1 \end{bmatrix}.$$ It is easy to check that this elementary matrix has determinant $1$: it is lower triangular, so its determinant is the product of its diagonal entries, which are all $1$. Hence $$\det \begin{bmatrix}1& 1& 1 \\ 1& 0 &1 \\ 0& 0 &1 \end{bmatrix} = \det \begin{bmatrix}1& 1& 1 \\ 4& 3 &4 \\ 0& 0 &1 \end{bmatrix}.$$
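If you want to check the arithmetic above numerically, here is a small NumPy sketch using the same matrices (NumPy is just an assumed tool here; the argument does not depend on it):

```python
import numpy as np

# E is the elementary matrix that adds 3 times row 1 to row 2 of the identity.
E = np.array([[1.0, 0.0, 0.0],
              [3.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
M = np.array([[1.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])

EM = E @ M  # same as adding 3*(row 1) to row 2 of M directly

print(EM)                # the matrix [[1,1,1],[4,3,4],[0,0,1]] from the text
print(np.linalg.det(E))  # ~1: E is triangular with 1's on the diagonal
# det(EM) = det(E)*det(M) = det(M), so the two determinants agree:
print(np.linalg.det(M), np.linalg.det(EM))
```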