We have variables $x^1, ..., x^r$. Each variable is a vector of dimension $n$. They must all be equal.
My textbook says that we will "represent" this condition by writing $\sum_{j = 1}^r H^j x^j = 0$ where $H = (H^1, \dots, H^r)$ is a $k \times nr$ matrix.
I don't quite understand what is meant by this. What exactly is this $H$ matrix? What does it look like? And what is $k$?
This is a bit obscurely stated, but I think what's going on is this.
If you have two vectors $x$ and $y$ with $n$ coordinates, you can think of the vector equality $x=y$ as a system of $n$ linear equations (equality of each of their respective coordinates, namely $x_i=y_i$ for $i=1,\dots,n$). You can generalize this to $r$ vectors by chaining the equalities $x^1=x^2$, $x^2=x^3$, ..., $x^{r-1}=x^r$, giving $r-1$ such sets of $n$ linear equations. Thus, you end up with a system of $k=(r-1)n$ linear equations in $rn$ variables. This can, in turn, be written as a single matrix equation $Hx=0$, where $x$ is the giant vector with $rn$ coordinates obtained by stacking $x^1,\dots,x^r$ (the coordinates $x^j_1,\dots,x^j_n$ of $x^j$ occupy the $j$th "slot"), and $H$ is the $(r-1)n \times rn$ block matrix whose $j$th block row has the $n\times n$ identity $I$ in slot $j$, $-I$ in slot $j+1$, and $0$'s elsewhere.
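Here is a minimal NumPy sketch of this construction (the function name and the choice of encoding consecutive differences $x^j - x^{j+1} = 0$ are my own; any $H$ whose kernel is the "all equal" subspace would do):

```python
import numpy as np

def equality_constraint_matrix(n, r):
    """Build the k x (n*r) matrix H, with k = (r-1)*n, whose kernel is
    exactly the set of stacked vectors (x^1, ..., x^r) with all x^j equal.
    Block row j encodes x^j - x^{j+1} = 0."""
    I = np.eye(n)
    H = np.zeros(((r - 1) * n, r * n))
    for j in range(r - 1):
        H[j*n:(j+1)*n, j*n:(j+1)*n] = I       # +I in slot j
        H[j*n:(j+1)*n, (j+1)*n:(j+2)*n] = -I  # -I in slot j+1
    return H

n, r = 3, 4
H = equality_constraint_matrix(n, r)

v = np.array([1.0, 2.0, 3.0])
x_equal = np.tile(v, r)          # r identical copies, stacked
x_unequal = x_equal.copy()
x_unequal[-1] += 1.0             # perturb the last block

print(np.allclose(H @ x_equal, 0))    # True:  all blocks equal
print(np.allclose(H @ x_unequal, 0))  # False: last block differs
```

Note that this particular $H$ is not unique: any matrix with the same kernel (for instance, comparing every block to the first, $x^1 - x^j = 0$) represents the same condition.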