Suppose we have an $n \times n$ chessboard where each square can be in one of two states: "on" or "off". At time $t=0$ all squares are off. Every second one square is selected uniformly at random out of the $n^2$ squares, and the corresponding bit is flipped. How can one define the entropy of the "on" bits in this process, and show that it increases? This came out of a model of a gas in a container, where we divide the container using a grid: a grid cell is "on" iff there is a gas molecule in it. The entropy should increase on average, right? It seems to me it could be related to the variance of the number of "on" squares, but I am no expert in the field. Any hints?
Thanks in advance!
A simple measure is just the number of configurations of the on bits. If $k$ bits are on, there are $\binom{n^2}{k}$ configurations they can be in, so you can say the entropy is $\log \binom{n^2}{k}$. Initially $k$ increases rapidly because the chance of turning off an on bit is small. As $k$ approaches $n^2/2$ you approach maximum entropy. There doesn't seem to be anything in the problem that involves the particular pattern of bits, so I wouldn't do anything more complicated.
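A quick sketch of this in Python, just to see the behavior (the board size, step count, and seed are arbitrary choices for illustration): simulate the uniform random flips and track $\log \binom{n^2}{k}$, where $k$ is the current number of on bits.

```python
import math
import random

def simulate(n=8, steps=2000, seed=0):
    """Flip a uniformly random square each step on an n-by-n board
    and record the entropy log C(n^2, k) after each flip, where k
    is the number of squares currently on."""
    rng = random.Random(seed)
    N = n * n
    board = [False] * N   # all squares start off
    k = 0                 # number of on squares
    entropies = []
    for _ in range(steps):
        i = rng.randrange(N)      # pick a square uniformly at random
        board[i] = not board[i]   # flip its bit
        k += 1 if board[i] else -1
        entropies.append(math.log(math.comb(N, k)))
    return entropies

ent = simulate()
```

Plotting `ent` shows the expected shape: a rapid initial rise, then fluctuation near the maximum $\log \binom{n^2}{n^2/2}$ once $k$ hovers around $n^2/2$.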