I'm a little confused as to whether or not this question belongs here or on https://cstheory.stackexchange.com/, so please, bear with me.
I've been reading a few books on the concepts of information, data, knowledge and wisdom, and the topic of MTC (the Mathematical Theory of Communication) keeps coming up. I understand the theory, to an extent, but every single author claims that 'information theory' is the wrong label for Shannon's MTC.
The problem is that none of these authors seem interested in explaining WHY this label is such a mismatch.
Is it just because calling MTC 'information theory' is too broad?
Does anyone know why well-known authors in information theory, like Luciano Floridi, have such a problem with this label?
Again, please be merciful if this is not the correct place to ask said question.
EDIT
Here's a link to one such case where the label is being criticized: On that page.
It's the only one I can think of right now, but if anyone could take a stab at why the label is so unfitting even in that single case, it would give me some insight into what these authors mean.
Shannon's information theory is a fairly broad, multidisciplinary subject grounded primarily in electrical engineering and mathematics, with overlap in computer science, physics, statistical mechanics, statistics, and even philosophy. I had not run across any writers using the acronym MTC, and supposed (correctly, after checking) that the author you were referencing was a philosopher. Because most researchers come at the topic from different backgrounds, without necessarily a firm grounding in all of the mathematics and science required to work at a high level, there are often differing viewpoints on what the subject is about (particularly when it comes to the philosophical viewpoints). One's thoughts on information theory are usually heavily flavored by one's main area of expertise; a computer scientist, for example, can (and likely will) have a very different view of information theory than a mathematician or an electrical engineer.
A lot of the general problems stem from the basic definition of what entropy is mathematically and how it is applied, and even more so from the philosophical vantage point you're viewing it from. I haven't read it yet, but given the area of the text you mentioned, you might also benefit from taking a look at Traditions of Systems Theory, recently released by Routledge, which covers a broad range of the historical build-up of the subject (particularly from a philosophical viewpoint). As a general caveat, be sure to delve into some of the mathematics behind the theory before getting swamped in the semantics of the philosophy, which can swerve wildly from the black and white of the mathematics.
One of the primary problems, which you won't often see stated, is that information theory is a relatively young field, and, for lack of better terminology, researchers use words that make sense to them as the science grows. This can have adverse effects on future students and their understanding, particularly when a word like "information" has such a nebulous everyday meaning compared with how it is used in the mathematics, physics, and engineering disciplines. As a related example, consider the story of how Shannon (prompted by John von Neumann) named entropy, a fundamental quantity in information theory: von Neumann reportedly advised Shannon to call his uncertainty measure "entropy", both because the same function already appeared in statistical mechanics under that name, and because "no one really knows what entropy is, so in a debate you will always have the advantage."
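For concreteness, the quantity in question is Shannon's entropy, H(X) = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch in Python (the function name here is mine, not a standard API):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    probs: the probability of each outcome; assumed to sum to 1.
    Terms with p == 0 contribute nothing (lim p*log p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
```

Note that nothing in this formula refers to what the outcomes *mean*; it measures only the uncertainty of the distribution, which is exactly the decoupling from semantics discussed below.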
Many researchers may have fundamental problems with the broad definitions provided within the science, but for lack of anything better they're stuck using what those before them have used, and what is common in the literature, until there is a major paradigm shift and the definitions are rewritten and drastically clarified. I might suggest that one of those helping to clarify and consolidate information theory and its definitions is Arieh Ben-Naim, whose text A Farewell to Entropy unifies the definitions of entropy in information theory and statistical mechanics. Given the range of areas in which it is applied, it will likely be a while before the definition of "information theory" is unified under all of the umbrellas in which it is used. Until then, pay particular attention to the background in which a given writer trained, to better grasp their particular definitions. In your case, philosophers will have a far broader view of the field than mathematicians (or most others) will.
I might also suggest being cautious of something that appears in the opening paragraphs of Shannon's original paper, where he very specifically states that he is decoupling the mathematics of information theory from semantic meaning. Many of the philosophers you will encounter are attempting to delve specifically into that semantic portion of the problem and to re-couple the two concepts, which presents many of the problems of definition and application that you may be bumping up against.