Before Shannon's 1948 paper, most people had not realized that entropy, information, and thermodynamics are closely related. I believe the concept of complexity today is similarly reducible across fields of science.
Computational complexity is a well-established mathematical framework. Complex networks are a widely studied topic in the physics of networks, and "complex matter" and "complex systems" are common jargon in many fields of science.
I would like to learn about a generic definition of complexity, not just one seen through a computational lens. Is there an effort to define complexity in terms common to these fields? We do have equivalence classes for computational complexity, but I haven't seen physicists use this kind of formalism, as opposed to information theory and statistical physics.
My question boils down to this: is there a need for, or an existing formalism to describe, how complex a system is?
Edit:
I found https://arxiv.org/abs/0903.2037 which is a nice introduction by Joseph Traub.
Traub, Joseph F. "A brief history of information-based complexity." Essays on the Complexity of Continuous Problems, European Mathematical Society (2009): 61-71.
Edit 2: Leonard Susskind's lecture introduces the notion of complexity as a metric: the minimal number of universal logic/quantum gates that must be applied to the vacuum state (or to a state A) to reach a state B. The relative complexity between two states is thus the length of the minimal path in Hilbert space, which rephrases what "reduction" means in a physicist's language. Since it is associated with distances and volumes in phase space (the space of possible configurations of the system), complexity induces a type of entropy.
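As a toy illustration of this idea (my own sketch, not from the lecture), one can compute gate complexity exactly in a finite classical setting: take n-bit strings as states, a small reversible gate set (NOT and CNOT) as the allowed moves, and define the relative complexity of two states as the shortest path between them in the resulting graph of gate applications:

```python
from collections import deque

def gate_complexity(start, target, n):
    """Minimal number of gates mapping bitstring `start` to `target`.

    Toy reversible gate set: NOT on any bit, CNOT on any ordered
    pair of bits. BFS finds the shortest path in the graph whose
    vertices are states and whose edges are single gate applications,
    i.e. the relative complexity of the two states.
    """
    def neighbors(state):
        # NOT gate on bit i: flip bit i unconditionally
        for i in range(n):
            yield state ^ (1 << i)
        # CNOT(control=c, target=t): flip bit t if bit c is set
        for c in range(n):
            for t in range(n):
                if c != t and state & (1 << c):
                    yield state ^ (1 << t)

    dist = {start: 0}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        if s == target:
            return dist[s]
        for nb in neighbors(s):
            if nb not in dist:
                dist[nb] = dist[s] + 1
                queue.append(nb)
```

For example, `gate_complexity(0b000, 0b111, 3)` returns 3: starting from the all-zero "vacuum" state, CNOTs do nothing (no control bit is set), so three NOT gates are needed. The quantum version replaces this finite graph with geodesics on the unitary group, but the shortest-path structure is the same.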
Being somewhere in the wide and diffuse field of complex-systems science, I am not aware of any effort to define complexity, nor does anybody seem to feel a need for one. It's more an "I know it when I see it" thing. For example, the 90-page review paper The Structure and Function of Complex Networks hardly ever uses the word complex outside of the title, and never in a way that could serve as a definition.
The main reason for this is that no strict or surprising consequences arise from complexity in general. You won't have statements like "If the system is complex, then X" or "X, unless the system is complex." If anything, you might make similar statements with better-defined properties that align with complexity (say, chaos, though defining that is already a problem in itself). You may also sometimes find somebody using a very specific notion of complexity for specific objects, say time series, but that's not what you are looking for.
Instead, the term complexity is used more to delineate a field of research. And while some people may jokingly bicker about whether their research concerns truly complex systems, the exact classification does not really matter as long as the research yields useful insights.