As a linguist I sometimes have a hard time figuring out the sense in which mathematical concepts are, or might be, used in my discipline. It is no wonder that we linguists are fascinated by the invention of the alphabet as a seminal milestone in human history. Mathematics, however, plays with the concept in a manner I would like to grasp fully. For instance, a definition I find useful (and probably standard) can be found in
Johannes A. Buchmann, *Introduction to Cryptography*, New York, Springer, 2001, p. 73:
“To write texts, we need symbols from an alphabet. By an alphabet we mean a finite nonempty set $\Sigma$. The length of $\Sigma$ is the number of elements in $\Sigma$. The elements of $\Sigma$ are called symbols or letters.”
“Because alphabets are finite sets, their symbols can be identified with nonnegative integers. If an alphabet has length $m$, then its symbols are identified with the numbers in $\mathbb{Z}_m = \{0,1,\ldots, m-1\}$.”
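Buchmann's identification of symbols with the residues in $\mathbb{Z}_m$ can be sketched in a few lines of Python; the particular alphabet below is an arbitrary illustrative choice, not taken from the book:

```python
# Sketch: identifying the symbols of a finite alphabet Sigma with
# Z_m = {0, 1, ..., m-1}, following Buchmann's definition.
alphabet = ['a', 'b', 'c', 'd']          # Sigma, a finite nonempty set
m = len(alphabet)                        # the length of Sigma

# Bijections between Sigma and Z_m
to_int = {symbol: i for i, symbol in enumerate(alphabet)}
to_symbol = dict(enumerate(alphabet))

word = "badcab"
encoded = [to_int[s] for s in word]      # a text becomes a sequence over Z_m
decoded = "".join(to_symbol[i] for i in encoded)

print(encoded)   # [1, 0, 3, 2, 0, 1]
print(decoded)   # badcab
```

Once texts are sequences over $\mathbb{Z}_m$, all the arithmetic machinery of modular arithmetic is available, which is exactly what the cryptographic constructions later in the book rely on.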
Then he proceeds to elaborate on this in a fashion that will be too trivial for our audience here.
The fact is that, while reading a popular science book in Spanish about Kolmogorov, in the chapter on Kolmogorov complexity, after introducing all the typical concepts (entropy, mutual information and the like), the authors mentioned that Kolmogorov and his collaborators extended their ideas about finite alphabets (for a linguist, the feature 'finite' seems essential to the definition of an alphabet) to what they called infinite alphabets. I have searched the web in English, German and Spanish, but found no insightful or useful commentary in this regard. Could someone explain to me what an infinite alphabet is and, specifically, how Kolmogorov complexity, $\varepsilon$-entropy and so on are to be understood for infinite alphabets?
Many thanks in advance.
Well, I guess Shannon and Kolmogorov started by defining the capacity or entropy of binary channels (i.e. messages over the alphabet $\{0,1\}$ with some Bernoulli measure), but, as always, mathematics later took over and generalized this as much as possible while keeping the important properties and results about entropy.
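For the binary case this is very concrete: a Bernoulli($p$) source over $\{0,1\}$ has Shannon entropy $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ bits per symbol. A minimal sketch:

```python
import math

def bernoulli_entropy(p):
    """Shannon entropy, in bits per symbol, of a Bernoulli(p) source
    over the two-symbol alphabet {0, 1}."""
    if p in (0.0, 1.0):
        return 0.0   # a deterministic source carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(bernoulli_entropy(0.5))   # 1.0  -- a fair coin is maximally unpredictable
print(bernoulli_entropy(0.9))   # about 0.469 -- a biased coin is more compressible
```

The generalization to larger (and eventually countable) alphabets replaces the two-term sum with $-\sum_i p_i \log_2 p_i$; for countable alphabets one has to worry about whether that series converges, which is one of the issues the extended theory addresses.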
Nowadays Kolmogorov(-Sinai) entropy is a very important tool in dynamical systems and ergodic theory, and the Gauss map is just one example of a dynamical system with many applications, to number theory among other fields. The point is that it should be treated using a countable partition (simply because the map has countably many surjective branches), and this can be done with the generalized theory of Kolmogorov entropy for countable alphabets.
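To make the countable alphabet tangible: the Gauss map is $T(x) = 1/x - \lfloor 1/x \rfloor$ on $(0,1)$, and its natural partition has one cell $(1/(k+1), 1/k]$ for each positive integer $k$, so the symbol emitted at each step is $\lfloor 1/x \rfloor$, i.e. a continued fraction digit drawn from the infinite alphabet $\{1, 2, 3, \ldots\}$. A minimal numerical sketch (floating point, so only reliable for a few iterations):

```python
import math

def gauss_map(x):
    """The Gauss map T(x) = 1/x - floor(1/x) on (0, 1)."""
    return 1.0 / x - math.floor(1.0 / x)

def cf_digits(x, n):
    """First n continued-fraction digits of x in (0, 1).
    The symbol at each step is floor(1/x), which names the cell
    (1/(k+1), 1/k] of the countable partition that x falls into --
    so the 'alphabet' of this coding is the infinite set {1, 2, 3, ...}."""
    digits = []
    for _ in range(n):
        digits.append(math.floor(1.0 / x))
        x = gauss_map(x)
    return digits

# The golden-ratio conjugate has continued fraction [1, 1, 1, ...]
phi_conj = (math.sqrt(5) - 1) / 2
print(cf_digits(phi_conj, 5))   # [1, 1, 1, 1, 1]
```

Since every positive integer occurs as a digit for some $x$, no finite alphabet suffices to code this system, which is exactly why the entropy theory had to be extended to countable partitions.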