I would appreciate any pointer or help with the following random walk. Consider a $k\times n$ $0$-$1$ matrix $A$, on which the following operation is performed repeatedly:
- Two different columns $C_i$ and $C_j$ and a random bit $b$ are selected uniformly at random.
- Column $C_i$ is updated to $C_i+C_j+b \pmod 2$ (i.e., for every row $r=1,\dots,k$, $A_{ri}\leftarrow A_{ri}+A_{rj}+b \bmod 2$).
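For concreteness, one step of the walk can be sketched as follows (a minimal Python sketch; the function name `step` and the matrix-as-list-of-rows representation are my own choices, not part of the question):

```python
import random

def step(A):
    """Apply one move of the walk to a k x n 0-1 matrix A (list of row lists).

    Pick two distinct column indices i != j and a bit b uniformly at random,
    then set column C_i to C_i + C_j + b (mod 2), row by row.
    """
    n = len(A[0])
    i, j = random.sample(range(n), 2)  # two different columns, i != j
    b = random.randrange(2)            # uniform random bit
    for row in A:
        row[i] = (row[i] + row[j] + b) % 2
    return A

# Example: a few steps on a 2 x 4 matrix.
random.seed(0)
A = [[0, 1, 0, 1],
     [1, 1, 0, 0]]
for _ in range(5):
    step(A)
```

Note that each move is an involution (applying the same $(i,j,b)$ twice restores the matrix), so every move is invertible; in particular, rows that start out distinct remain distinct.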
My main questions are:
a) How many steps are needed to end up with a matrix which is close (in terms of the total variation distance) to being uniform?
b) For $k=1,2$, $O(n\log n)$ steps are sufficient to approach uniformity. For larger values of $k$, however, a straightforward independence argument is unlikely to apply, since the operation acts simultaneously on all rows. Hence, what is the largest value of $k$ for which closeness to uniformity can be achieved?
The question is essentially one of the mixing rate of a random walk on a graph whose vertices are all possible $k$-tuples of distinct $n$-bit strings (the rows of the matrix), and whose edges are induced by the operation described above.
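For very small parameters one can probe question (a) empirically: run many independent copies of the chain from a fixed start and compare the empirical distribution after $t$ steps against the uniform distribution on $k$-tuples of distinct $n$-bit strings. The sketch below does exactly that (the names `step` and `tv_to_uniform` are mine; whether uniform is indeed the limiting distribution for larger $k$ is of course part of the question, so this only measures distance to that target):

```python
import itertools
import random
from collections import Counter

def step(state, n):
    """One move of the walk; state is a tuple of k row tuples (immutable)."""
    i, j = random.sample(range(n), 2)  # two different columns
    b = random.randrange(2)            # uniform random bit
    return tuple(
        row[:i] + ((row[i] + row[j] + b) % 2,) + row[i + 1:]
        for row in state
    )

def tv_to_uniform(k, n, t, trials=5000, seed=1):
    """Monte-Carlo estimate of the total variation distance to the uniform
    distribution on ordered k-tuples of distinct n-bit strings, after t steps
    of the walk started from a fixed state."""
    random.seed(seed)
    strings = list(itertools.product((0, 1), repeat=n))
    states = list(itertools.permutations(strings, k))  # distinct rows
    start = tuple(strings[:k])  # arbitrary start with distinct rows
    counts = Counter()
    for _ in range(trials):
        s = start
        for _ in range(t):
            s = step(s, n)
        counts[s] += 1
    u = 1.0 / len(states)
    return 0.5 * sum(abs(counts.get(s, 0) / trials - u) for s in states)
```

Since each move is invertible and the move set does not depend on the current state, the transition matrix is symmetric, so the uniform distribution is stationary on each communicating class; the open part is the mixing rate and irreducibility for general $k$.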
Thank you for your time on this. I remain available for any questions you might have.
TD