Beginning algorithmic information theory


I was wondering if the stack-exchange community could provide me with some resources for beginning studies in algorithmic information theory, with a focus on how it intersects with computability, complexity, and logic.

I've come across a fair number of texts on information theory, but concepts such as Chaitin's constant and Kolmogorov complexity tend to take a back seat to the more standard Shannon-style theory.
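(To make the flavor of these concepts concrete: the Kolmogorov complexity K(x) of a string is the length of the shortest program that outputs it. K itself is uncomputable, but the length of any lossless compression of x gives a computable upper bound, up to an additive constant. A minimal sketch using Python's standard `zlib` module:)

```python
import random
import zlib

def compression_upper_bound(x: bytes) -> int:
    """Length of a zlib-compressed encoding of x: a computable
    upper bound on K(x), up to an additive constant."""
    return len(zlib.compress(x, 9))

# A highly regular string compresses to far below its length...
regular = b"0" * 1000

# ...while a pseudo-random string of the same length barely compresses,
# reflecting its (probable) high Kolmogorov complexity.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))

print(compression_upper_bound(regular), compression_upper_bound(noisy))
```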

An ideal reference (book, set of lecture notes, or what have you) would be one which develops the necessary information theory from the ground up, so that we can talk about concepts like those mentioned above.

(As a related aside: I did find this interesting text entitled Computability and Randomness which, at a brief glance, appears to discuss the relationships between recursion theory, descriptive set theory, and algorithmic randomness. While not quite what I was looking for, it does "match the flavor" of what I'm hoping to find.)

Bonus question (warning, this might be a dumb one): is there any meaningful connection between information geometry and algorithmic information theory?