I find Borel's theorem, which asserts that almost all real numbers are normal, very counterintuitive. Think about the interval [0,1) and imagine a number given by its infinite decimal expansion (let's assume base 10): 0.xxxxxxx....., where each x is one digit (0..9) of the expansion.
Then it seems very obvious to me that most numbers in the interval do not have the distribution properties of a normal number (each digit 0..9 showing up 1/10th of the time, each pair 00..99 showing up 1/100th of the time, etc.).
Here is a simple experiment: using, say, only 10 decimal digits 0.xxxxxxxxxx and filling the decimal places with all possible values, we get the numbers 0.0000000000, 0.0000000001, 0.0000000002, and so on up to 0.9999999999. By simple counting, we see that the numbers with the digit distribution of a normal number are a very small subset of all possible numbers (they are only the permutations of 0.0123456789).
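To make the counting concrete, here is a small sketch (assuming we call a 10-digit string "perfectly balanced" when every digit 0..9 appears exactly once; the variable names are mine):

```python
from math import factorial

# 10-digit decimal strings in which every digit 0..9 appears exactly once
# are exactly the permutations of "0123456789": 10! of them.
perfectly_balanced = factorial(10)   # 3628800
total = 10 ** 10                     # all 10-digit strings 0.xxxxxxxxxx
print(perfectly_balanced, total, perfectly_balanced / total)
```

So fewer than 0.04% of these truncations have perfectly balanced digits, which is the counting behind my objection.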
This is even easier to see in base 2. Suppose we try all possible combinations of 2 digits in base 2. We have 0.xx, so there are 4 possibilities: 0.00, 0.01, 0.10, 0.11.
Only half of the numbers (0.01 and 0.10) are 'on the way to producing a normal number'.
I know that to be normal a number needs an infinite number of digits, but I cannot see how making the number of digits infinite overcomes the phenomenon described above. That is, 0.xxxxxxx..... should not have nicely distributed digits most of the time.
Could you please point out where I am wrong?
Let us stay with the binary case. You stopped your analysis at the second digit; that is a bit too early.
Let's consider $5$ digits. You then have one string with all ones and one with all zeros, and five each way where the count of digits breaks four against one. For all the rest, it breaks two against three. And I would say the latter are as much 'on the way to becoming normal' (in base two) as possible. These are already $20/32 = 5/8$ of all strings. You see, things progress.
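The $5$-digit tally can be checked by brute force; a minimal sketch (the variable names are mine):

```python
from itertools import product

# Tally all 2^5 binary strings of length 5 by their number of 1's.
counts = {}
for bits in product([0, 1], repeat=5):
    ones = sum(bits)
    counts[ones] = counts.get(ones, 0) + 1

print(counts)  # binomial coefficients: {0: 1, 1: 5, 2: 10, 3: 10, 4: 5, 5: 1}
balanced = counts[2] + counts[3]  # the two-against-three splits
print(balanced, "of", 2 ** 5)    # 20 of 32
```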
Now, if you continue, you will see this effect get ever stronger, provided you do not insist that the counts of $1$'s and $0$'s are exactly the same (or as close as possible, when the number of digits is odd), but instead let the margin of error you allow grow as the number of digits grows.
But this is perfectly reasonable, since the definition of normal numbers only asks that the frequencies agree asymptotically: if you denote the counts of $0$'s and $1$'s among the first $n$ digits by $f_0(n)$ and $f_1(n)$, then you only need $f_0(n)/n$ to tend to $1/2$ as $n$ tends to infinity. (This is not the full definition of normal, but it is what the definition says about the individual digits, in base two, which is enough to get the idea across.)
So, for instance, if you have a number with $f_0(n)$ about $n/2 + 100\,n^{3/4}$, then this still suffices to fulfill the condition for being normal, since $(n/2 + 100\,n^{3/4})/n = 1/2 + 100\,n^{-1/4} \to 1/2$.
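One can also check numerically how the two counts behave as $n$ grows: exactly balanced prefixes become rare, while prefixes within a margin like $n^{3/4}$ cover essentially everything (a sketch; I drop the constant $100$ to keep $n$ small, since the exponent is what matters, and `frac_within` is my own helper name):

```python
from math import comb

def frac_within(n, margin):
    """Fraction of the 2^n binary strings of length n whose number of
    1's differs from n/2 by at most `margin`."""
    good = sum(comb(n, k) for k in range(n + 1) if abs(k - n / 2) <= margin)
    return good / 2 ** n

for n in [10, 100, 1000]:
    exact = frac_within(n, 0)          # exactly n/2 ones: tends to 0
    loose = frac_within(n, n ** 0.75)  # margin n^(3/4): tends to 1
    print(f"n={n:4d}  exact={exact:.4f}  within n^(3/4)={loose:.6f}")
```

The exactly-balanced fraction shrinks (it is of order $1/\sqrt{n}$), while the fraction within the generous margin stays essentially $1$: this is exactly how the counting objection and the asymptotic definition are reconciled.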
You could redo the calculation I did above for $5$ digits with a couple of other values, to see the effect.