I have this proof:
For $v_{1},\dots,v_{k} \in \mathbb R^{n}$, and $\alpha \in \mathbb R \setminus \{0\}, \beta \in \mathbb R$ one can easily verify that \begin{align} \operatorname{rank}(v_{1},\dots,v_{k}) &= \operatorname{rank}(v_{1},\dots,v_{i-1},\alpha v_{i}, v_{i+1},\dots,v_{k})\\ & = \operatorname{rank}(v_{1},\dots,v_{i-1}, v_{i}, v_{i+1},\dots, v_{j-1},v_{j}+\beta v_{i},v_{j+1},\dots,v_{k}) \end{align}
Could someone explain this proof in layman's terms? It's supposed to conclude that the rank of a matrix is unaffected by row reduction.
It's useful to review the definition of rank.
The $k \times n$ matrix $M$ whose rows are the vectors $v_1,\dots,v_k$ from your question can be viewed as a function from $\mathbb R^k$ to $\mathbb R^n$: you input a $k$-vector $W = \langle w_1,\dots,w_k\rangle \in \mathbb R^k$ thought of as a $1 \times k$ matrix, and you carry out the matrix multiplication $WM$ to obtain a $1 \times n$ matrix thought of as an $n$-vector $WM \in \mathbb R^n$. Concretely, $WM = w_1 v_1 + \cdots + w_k v_k$, a linear combination of the rows.
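If it helps to see this numerically, here is a small NumPy sketch; the two rows $v_1, v_2 \in \mathbb R^3$ and the coefficient vector $W$ are made up for illustration:

```python
import numpy as np

# Hypothetical rows v_1, v_2 in R^3 stacked into a 2x3 matrix M.
M = np.array([[1., 0., 2.],
              [3., 1., 0.]])

# A 1x2 row vector of coefficients (w_1, w_2) = (2, -1).
W = np.array([2., -1.])

# WM is the linear combination 2*v_1 - 1*v_2, a vector in R^3.
out = W @ M
# out equals 2*[1, 0, 2] - [3, 1, 0] = [-1, -1, 4]
```

Varying $W$ over all of $\mathbb R^2$ sweeps out exactly the span of the rows, which is the row space discussed next.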
The image of this function is the row space of $M$; it is a subspace of the vector space $\mathbb R^n$, and its dimension is, by definition, equal to the rank of $M$.
Each row operation is equivalent to multiplying $M$ on the left by an invertible $k \times k$ matrix $E$. For instance, replacing $v_j$ with $v_j + \beta v_i$ is equivalent to replacing $M$ by the matrix $EM$, where $E$ has $1$'s down the diagonal, a $\beta$ in the $(j,i)$ entry, and $0$'s everywhere else; similarly, scaling $v_i$ by $\alpha$ corresponds to an $E$ with $\alpha$ in the $(i,i)$ diagonal entry, which is invertible precisely because $\alpha \neq 0$.
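A quick numerical check of this correspondence, again with a made-up matrix: building the elementary matrix $E$ (identity plus $\beta$ in the $(j,i)$ entry) and multiplying on the left really does replace row $j$ by $v_j + \beta v_i$ while leaving the other rows alone.

```python
import numpy as np

# A hypothetical 3x4 matrix M whose rows play the role of v_1, v_2, v_3.
M = np.array([[1., 2., 0., 1.],
              [0., 1., 3., 2.],
              [2., 0., 1., 1.]])

k = M.shape[0]
i, j, beta = 0, 2, 5.0  # replace row j by row j + beta * row i

# Elementary matrix E: identity with beta in the (j, i) entry.
E = np.eye(k)
E[j, i] = beta

EM = E @ M
# Row j of EM is v_j + beta * v_i; all other rows are unchanged.
```

Note that `E` is invertible: its inverse is the same matrix with $-\beta$ in place of $\beta$, which undoes the row operation.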
Since the $k \times k$ matrix $E$ is invertible, the matrices $M$ and $EM$ have the exact same row space: every row of $EM$ is a linear combination of rows of $M$, and since $M = E^{-1}(EM)$, every row of $M$ is likewise a linear combination of rows of $EM$. So the row space itself is unaffected by row operations. In particular, the row spaces of $M$ and $EM$ have the same dimension, and so the ranks of $M$ and $EM$ are equal.
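To see the conclusion in action, here is a small sketch (with an invented matrix whose third row is the sum of the first two, so its rank is $2$): applying an invertible elementary matrix on the left leaves the rank computed by `numpy.linalg.matrix_rank` unchanged.

```python
import numpy as np

# Made-up example: row 3 = row 1 + row 2, so the rank is 2, not 3.
M = np.array([[1., 2., 0., 1.],
              [0., 1., 3., 2.],
              [1., 3., 3., 3.]])

# Elementary matrix that subtracts row 1 from row 3 (invertible).
E = np.eye(3)
E[2, 0] = -1.0

rank_M = np.linalg.matrix_rank(M)
rank_EM = np.linalg.matrix_rank(E @ M)
# The two ranks agree, as the argument above guarantees.
```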