Kolmogorov complexity of a computer?


Warning: Vague, unclear question ahead. Proceed at your own risk.

Shannon entropy and Kolmogorov complexity tell you, in broad informal terms, how unpredictable a string is and to what degree the data in a string can be compressed, respectively. What I want to do is broaden those notions to capture the complexity of a particular computer or program; is there already an information-theoretic metric capable of that?

It seems to me that the answer is buried somewhere in algorithmic information theory, but I don't know. One way to do it would be to work from the possible input and output strings: to say that the conditional Kolmogorov complexity between a program's input and its output is the real essence of the complexity of the computation it carries out. So basically, I think the real information contained in a computer might be related to the matrix of conditional Kolmogorov complexities linking every possible input to every possible output. But that's a long (possibly infinite) and messy description, and you could never even approximate it for any practical computer. Or at least I'd have absolutely no idea how.
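To make the "matrix of conditional complexities" idea slightly less abstract: Kolmogorov complexity itself is uncomputable, but compressed length gives a computable upper bound, and the identity K(y|x) ≈ C(xy) − C(x) is the trick behind the normalized compression distance. Here's a rough sketch of that approximation for a toy program (the function `toy_program` and the choice of zlib are my own illustrative assumptions, not anything canonical):

```python
import zlib

def C(s: bytes) -> int:
    """Compressed length: a crude, computable upper bound on K(s)."""
    return len(zlib.compress(s, 9))

def cond_complexity(x: bytes, y: bytes) -> int:
    """Approximate K(y | x) as C(x + y) - C(x), clamped at zero."""
    return max(C(x + y) - C(x), 0)

# A toy "computer": any function from input strings to output strings.
def toy_program(x: bytes) -> bytes:
    return x[::-1]  # just reverses its input

# One row of the (in general infinite) input/output complexity matrix.
inputs = [b"a" * 100, b"abracadabra" * 10, bytes(range(100))]
for x in inputs:
    print(len(x), cond_complexity(x, toy_program(x)))
```

Highly repetitive inputs get a low conditional score (the output adds almost nothing beyond the input), while structureless inputs score higher; this is only a heuristic, since real compressors are far from the optimal machine the definition assumes.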

The point of all this is that I'm trying to begin formalizing the idea of damage to a computer: exactly how that damage leads to information loss, or loss of computational ability. In really broad strokes, I want an entirely formal way to say that the array of semiconductors forming a calculator is more complex than an array of semiconductors forming a calculator with lower precision, fewer operations, and a lower overflow value, which in turn is more complex than an array of dust particles incapable of any practical computation. Are the vague ideas I just outlined a reasonable way to think about that concept? Or is there an existing metric that could save me time?
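The calculator-ordering intuition above can at least be checked with a simpler proxy than the full Kolmogorov matrix: the Shannon entropy of a device's outputs under uniform inputs, which measures how many distinct computational outcomes it can still distinguish. This is a sketch under my own assumptions (the three `lambda` "calculators" are hypothetical stand-ins for full precision, degraded precision, and dust), not an established metric:

```python
from collections import Counter
from math import log2

def output_entropy(program, inputs) -> float:
    """Shannon entropy of the output distribution over uniform inputs:
    a rough proxy for how much a device's outputs can still distinguish."""
    counts = Counter(program(x) for x in inputs)
    n = len(inputs)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical devices: squaring at full precision, at coarse precision,
# and a pile of dust that computes nothing.
full = lambda x: x * x
low = lambda x: (x * x) // 100 * 100  # outputs rounded down to multiples of 100
dust = lambda x: 0                    # every input maps to the same output

xs = range(1000)
print(output_entropy(full, xs))  # highest: all outputs distinct
print(output_entropy(low, xs))   # lower: coarse outputs collide
print(output_entropy(dust, xs))  # zero: one outcome only
```

This captures only the "how many distinguishable outcomes" axis of damage, not which mappings were lost; the conditional-complexity matrix in the question is strictly richer, but this proxy at least yields the monotone ordering you describe.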