The matrix is fairly messy to present, but quite easy to understand. When $n=3$, the matrix is
$$\begin{bmatrix} 1 & 0 & 0 & 1 & 0 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 0 & 1 & 0 & 0 & 1 & 0\\ 0 & 0 & 1 & 0 & 0 & 1 & 0 & 0 & 1\\ 1 & 1 & 1 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 1 & 1 & 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 1 & 1 \end{bmatrix}$$
In general, the matrix splits into two parts:
the upper part (rows $1$ through $n$) consists of $n$ copies of the identity matrix $I_n$ placed side by side;
the lower part is $O_n^1, \ldots, O_n^n$ placed side by side, where each $O_n^i$ is the $n \times n$ matrix whose $i$-th row is all ones and whose other entries are zero.
From small examples and intuition, I believe the rank of such a matrix is $2n-1$, but how can I prove this rigorously?
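For what it's worth, the conjecture is easy to check numerically for small $n$; here is a quick sketch using NumPy (the helper name `build_matrix` is mine, not from the question):

```python
import numpy as np

def build_matrix(n):
    """Build the 2n x n^2 matrix described above."""
    # Upper part: n copies of the identity I_n side by side.
    top = np.hstack([np.eye(n, dtype=int)] * n)
    # Lower part: [O_n^1 | ... | O_n^n]; row i has ones exactly in the
    # i-th block of n columns, which is kron(I_n, all-ones row).
    bottom = np.kron(np.eye(n, dtype=int), np.ones((1, n), dtype=int))
    return np.vstack([top, bottom])

for n in range(2, 9):
    M = build_matrix(n)
    assert np.linalg.matrix_rank(M) == 2 * n - 1  # matches the conjectured rank
```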
The $2n$th row is the sum of the first $n$ rows minus the sum of rows $n+1$ through $2n-1.$ Thus the rank is at most $2n-1.$
For the reverse inequality, restrict attention to the last $n$ columns: there, the first $n$ rows form the identity $I_n$, while rows $n+1$ through $2n-1$ are all zero. So if a linear combination of the first $2n-1$ rows equals the zero row of length $n^2$, its first $n$ coefficients must all be $0$. But then, in the first $n^2-n$ columns, rows $n+1$ through $2n-1$ have disjoint supports (row $n+i$ has its ones only in the $i$-th block of columns), so the $(n+1)$-th through $(2n-1)$-th coefficients must be $0$ as well. Hence the first $2n-1$ rows are linearly independent, and the rank is exactly $2n-1$.
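Both steps of this argument, the single dependence and the independence of the remaining rows, can be verified numerically; a sketch for $n=4$ (the construction via `np.kron` is my own shorthand for the block structure):

```python
import numpy as np

n = 4
top = np.hstack([np.eye(n, dtype=int)] * n)                          # rows 1..n
bottom = np.kron(np.eye(n, dtype=int), np.ones((1, n), dtype=int))   # rows n+1..2n
M = np.vstack([top, bottom])

# Dependence: row 2n = (sum of rows 1..n) - (sum of rows n+1..2n-1).
dependence = top.sum(axis=0) - bottom[:-1].sum(axis=0)
assert np.array_equal(dependence, M[-1])

# Independence: the first 2n-1 rows have full rank 2n-1.
assert np.linalg.matrix_rank(M[:-1]) == 2 * n - 1
```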