If we apply Shannon's formula to a sequence of numbers that we know how to generate (for example, the natural numbers), shouldn't the entropy be 0?
My definition of entropy may be too intuitive, but if something is perfectly predictable, does it have any entropy at all? Despite that, such a sequence still appears to have entropy. Is '000000' really any different from '012345'? According to online entropy calculators, it is.
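For concreteness, here is a minimal sketch of what such online calculators presumably compute: the empirical Shannon entropy of the string, treating each character's relative frequency as its probability. Under that measure '000000' scores 0 bits per symbol while '012345' scores log2(6), even though both are trivially predictable sequences.

```python
from collections import Counter
from math import log2

def empirical_entropy(s: str) -> float:
    """Shannon entropy (bits per symbol) estimated from character frequencies."""
    n = len(s)
    counts = Counter(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(empirical_entropy("000000"))  # one distinct symbol: 0 bits
print(empirical_entropy("012345"))  # six equiprobable symbols: log2(6) ≈ 2.585 bits
```

This measure only looks at symbol frequencies, not at whether the sequence follows an obvious rule, which is exactly the tension in the question.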