I am reading "Elements of Information Theory" by Cover and Thomas, and I can't understand one paragraph, specifically the highlighted part:
"One of the basic laws of physics, the second law of thermodynamics, states that the entropy of an isolated system is nondecreasing. We now explore the relationship between the second law and the entropy function that we defined earlier in this chapter. In statistical thermodynamics, entropy is often defined as the log of the number of microstates in the system. This corresponds exactly to our notion of entropy if all the states are equally likely. But why does entropy increase? We model the isolated system as a Markov chain with transitions obeying the physical laws governing the system. Implicit in this assumption is the notion of an overall state of the system and the fact that knowing the present state, the future of the system is independent of the past. In such a system we can find four different interpretations of the second law. It may come as a shock to find that the entropy does not always increase. However, relative entropy always decreases."
How should I understand this? I know that entropy in thermodynamics and entropy in information theory are related but not the same thing. Does this mean that information-theoretic entropy doesn't obey the second law of thermodynamics?
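To make the question concrete, here is a small numerical experiment I tried (the two-state transition matrix is just something I made up; any chain with a non-uniform stationary distribution should behave similarly). Starting from the uniform distribution, the Shannon entropy of the state distribution actually *decreases* over time, while the relative entropy to the stationary distribution decreases as the book claims:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl(p, q):
    """Relative entropy D(p || q) in bits (assumes q > 0 where p > 0)."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# A made-up 2-state Markov chain; rows are transition probabilities.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Its stationary distribution (solves pi @ P = pi) is non-uniform:
pi = np.array([0.75, 0.25])

# Start at the uniform distribution, which has maximal entropy.
mu = np.array([0.5, 0.5])
for n in range(10):
    print(f"n={n}  H(mu)={entropy(mu):.4f}  D(mu||pi)={kl(mu, pi):.4f}")
    mu = mu @ P  # one step of the chain
```

Here H(mu) falls from 1 bit toward H(pi) ≈ 0.811 bits, so the entropy of the distribution is not increasing, yet D(mu || pi) shrinks toward 0 at every step, which seems to match the "relative entropy always decreases" claim.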