Let us suppose, for the sake of argument, that I eliminate one number from the naturals. If I were superstitious, I would eliminate the number 13. Now imagine that, in order to keep normal mathematics possible within such a system, this would require something like a new operation, different from addition, multiplication, etc.
Starting from a scenario like this one (which is, admittedly, entirely made up), could we arrive at a number system where, through some algorithm based on the new operation, the limits on information compression would be different?
I am considering in particular the problem of representing a set of non-repeating natural numbers. I know that there are limits on the bits needed for each number in this case, no matter what system is used, but there is still the issue of their concatenation.
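To make the concatenation issue concrete, here is a rough sketch in Python (the function names are my own, purely illustrative) of the counting bound I have in mind: there are C(n, k) ways to pick k distinct naturals below n, so about log2 C(n, k) bits identify the set, which is less than concatenating k fixed-width numbers:

```python
import math

def subset_bits(n, k):
    """Minimum bits to identify a k-element subset of {0, ..., n-1}:
    ceil(log2(C(n, k))), since there are C(n, k) such subsets."""
    return math.ceil(math.log2(math.comb(n, k)))

def concat_bits(n, k):
    """Bits used by naively concatenating k fixed-width numbers,
    each written with ceil(log2(n)) bits."""
    return k * math.ceil(math.log2(n))

# Example: 10 distinct numbers below 1000
print(subset_bits(1000, 10))   # 78 bits
print(concat_bits(1000, 10))   # 100 bits
```

The gap comes from the concatenation not exploiting the fact that order does not matter and that the numbers never repeat.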
The system without 13 would have some different theorems, or theorems replacing the usual ones. So I wonder which parts of our mathematical theory the compression limits are attached to, in order to consider whether these are "replaceable" by hypothetical new ideas like the fictitious system without 13.
I cannot make much sense of the question, especially the third paragraph. The concepts of source encoding (information theory, Shannon's first theorem, entropy, compression) are not related to "number representation" or "number systems" (I still don't understand what you mean by your example of a "system without 13", but anyway...). The underlying model deals only with a source that emits symbols from some (finite or infinite) alphabet, with given probabilities. Once this model is known, the entropy rate is defined, and this gives you the standard lower bound on the average length of any uniquely decodable encoding.
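To illustrate that standard bound, here is a minimal Python sketch (the source probabilities and the prefix code are an illustrative textbook example, not taken from the question): the entropy H gives the lower bound, and a well-matched prefix code can meet it exactly when the probabilities are powers of 1/2:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p): the lower bound on the
    average bits per symbol of any uniquely decodable code."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source with four symbols, probabilities 1/2, 1/4, 1/8, 1/8
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)  # 1.75 bits/symbol

# A matching prefix code: 0, 10, 110, 111 (codeword lengths 1, 2, 3, 3)
lengths = [1, 2, 3, 3]
avg_len = sum(p * l for p, l in zip(probs, lengths))  # 1.75, meets the bound

print(H, avg_len)
```

Note that nothing in this model depends on how the symbols themselves are "written" as numbers; only the probabilities matter.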