$$\varepsilon'\varepsilon = (y - X\beta)'(y-X\beta)$$
To estimate $\beta$, I understand how to proceed, but I am confused about what $'$ means and why the least-squares criterion is equal to $\varepsilon'\varepsilon$, where $y$ is the vector of observations, $X$ is an $n\times k$ design matrix, $\beta$ is the vector of parameters, and the error term is the observation minus the fitted value, i.e. $\varepsilon = y - X\beta$.
In this context, the prime notation denotes the vector/matrix transpose. For a vector $$ v = \left(\begin{matrix} a \\ b \\ c \end{matrix}\right) $$ we have $$ v' = \left(a\,\, b \,\,c\right). $$ Notice how the column vector has become a row vector. For a matrix $$ A = \left[\begin{matrix} a & b \\ c & d \end{matrix}\right] $$ we have $$ A' = \left[\begin{matrix} a & c\\ b & d \end{matrix}\right]. $$ Here the entries are reflected about the main diagonal. Try writing out the $n\times m$ case instead of the $n\times n$ case above.

This also answers the second part of your question: for the error vector $\varepsilon = (\varepsilon_1, \ldots, \varepsilon_n)'$, the product of the row vector $\varepsilon'$ with the column vector $\varepsilon$ is the scalar $$ \varepsilon'\varepsilon = \varepsilon_1^2 + \varepsilon_2^2 + \cdots + \varepsilon_n^2, $$ which is exactly the sum of squared errors that least squares minimizes.
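If it helps to see this numerically, here is a small NumPy sketch (the data and the candidate $\beta$ below are made up purely for illustration) showing that `.T` plays the role of $'$, and that $\varepsilon'\varepsilon$ is the same number as summing the squared residuals:

```python
import numpy as np

# Transpose turns a column vector into a row vector, and flips a matrix
# about its main diagonal -- this is what the prime notation means.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(A.T)            # [[1. 3.], [2. 4.]]

# Toy regression: n = 4 observations, k = 2 parameters (intercept + slope).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
beta = np.array([0.5, 2.0])      # an arbitrary candidate parameter vector

eps = y - X @ beta               # residual vector: eps = y - X beta
sse = eps.T @ eps                # eps' eps, a scalar
print(sse)                       # same value as np.sum(eps**2)
```

Least squares then picks the $\beta$ that makes this scalar `sse` as small as possible.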