(One direction of) Shannon's channel coding theorem says that for a discrete memoryless channel, all rates below capacity $C$ are achievable. Specifically, for every rate $R < C$, there exists a sequence of $(2^{nR}, n)$ codes with maximum probability of error $\lambda^{(n)} \to 0$.
As I understand it, $n$ counts the number of channel uses. Is this correct? If so, then the number $2^{nR}$ of input messages is also increasing. What is the physical interpretation of this increase in the number of messages?
You are correct.
The physical interpretation of the $2^{nR}$ messages is: they are exactly the messages that can be written with $nR$ bits.
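To make that concrete (a small sanity check; writing $\mathcal{M}$ for the message set is just my notation here):

$$|\mathcal{M}| = 2^{nR} \quad\Longleftrightarrow\quad \log_2 |\mathcal{M}| = nR \ \text{bits},$$

so a message index $W \in \{1, 2, \dots, 2^{nR}\}$ is an $nR$-bit string, and the encoder maps it to a codeword of length $n$.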
Therefore the theorem claims that, by using long enough messages (i.e., a large enough blocklength $n$), we can get arbitrarily close to error-free transmission, as long as we transmit at a rate below the channel capacity. Here "transmit at rate $R$" translates to "pass $X$ bits using $X/R$ channel uses".
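If it helps, here is a minimal sketch of that bookkeeping (the function name `blocklength_needed` is mine, purely for illustration):

```python
from math import ceil

def blocklength_needed(k_bits: int, rate: float) -> int:
    """Channel uses needed to send a k_bits-bit message at `rate` bits per channel use."""
    return ceil(k_bits / rate)

# Example: a 1000-bit message at rate R = 0.5 bits/use needs n = 2000 channel uses;
# the codebook then has 2**(n*R) = 2**1000 codewords, one per possible message.
k, R = 1000, 0.5
n = blocklength_needed(k, R)
print(n)      # 2000
print(n * R)  # 1000.0 -- equals k, the number of message bits
```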
E.g., if $C = 2$, then we can (with arbitrarily low error probability, for long enough messages) expect to transmit one bit per channel use ($R = 1$); but we would be real fools to try to pass 3 bits per use: that cannot be done, by the other direction (the converse) of the theorem.
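For the $C = 2$ example, one concrete channel (my assumption for illustration, not something specified above) is a noiseless channel with four input symbols, whose capacity is $\log_2 4 = 2$ bits per use. A short sketch of the rate-versus-capacity check:

```python
from math import log2

def noiseless_capacity(num_symbols: int) -> float:
    """Capacity (bits per channel use) of a noiseless channel with num_symbols distinct inputs."""
    return log2(num_symbols)

def achievable(rate: float, capacity: float) -> bool:
    """Channel coding theorem: every rate strictly below capacity is achievable."""
    return rate < capacity

C = noiseless_capacity(4)   # 2.0 bits per channel use
print(achievable(1.0, C))   # True  -- 1 bit per use is below capacity
print(achievable(3.0, C))   # False -- 3 bits per use exceeds capacity
```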