I'm actually not sure where to pose this question, but we do have an Information Theory tag, so this must be the place. The "simple" question is in the title: how do I determine how many bits of information are in the question "How much information is in this question?"?
Or, for a simpler one, how many bits of information are in the sentence "The quick brown fox jumps over the lazy dog"?
To clarify, I'm not asking how much information is in it when read by someone, but rather how much information, in bits, does it have inherently? Intuitively, there should be an answer.
I was going over my old papers and stumbled upon one where I attempted to use information theory to measure how effective mnemonics are. I didn't get very far, but now the question's in my head again. Any links to other articles or journals would be appreciated as well.
There is in fact an existing mathematical definition of exactly this concept, called Kolmogorov complexity, but you will likely be disappointed, because there is a serious catch: it is only defined up to an additive constant that depends on your model of computation. Aside from that constant, however, it is invariant across a wide range of models (this is the invariance theorem). So even though we can't pin down the "inherent" Kolmogorov complexity of a particular string without fixing a model, given two infinite sequences of strings we can compare them and ask whether one eventually becomes more complex than the other in a model-independent sense.
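As a hands-on aside: Kolmogorov complexity itself is uncomputable, but any compressor gives a computable *upper bound* on it (the compressed output plus a fixed decompressor is a program that prints the string). A crude sketch of this idea, using Python's `zlib` as the compressor of convenience — the specific function and strings here are just illustrative, not a standard measure:

```python
import zlib

def compressed_bits(s: str) -> int:
    """Crude upper-bound proxy for a string's information content:
    the length, in bits, of its zlib-compressed form."""
    return 8 * len(zlib.compress(s.encode("utf-8"), 9))

# A "typical" English sentence vs. a highly regular string of the same length.
sentence = "The quick brown fox jumps over the lazy dog"
repetitive = "ab" * 22  # also 44 characters, but very compressible

print(compressed_bits(sentence))
print(compressed_bits(repetitive))
```

The repetitive string compresses to far fewer bits than the pangram, matching the intuition that it "contains less information" — though the absolute numbers depend entirely on the compressor chosen, which is the same model-dependence problem in miniature.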