Both Benford's Law (in a list of naturally occurring numerical values, the distribution of the most significant digit is roughly logarithmic, so smaller leading digits appear far more often than larger ones) and Zipf's Law (given a corpus of natural-language utterances, the frequency of any word is roughly inversely proportional to its rank in the frequency table) are not theorems in a mathematical sense, but they hold up remarkably well in real life.
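For concreteness, the usual quantitative statements are (with $d$ denoting the leading digit and $\operatorname{rank}(w)$ the rank of word $w$ in the frequency table):

$$\Pr(\text{leading digit} = d) = \log_{10}\!\left(1 + \frac{1}{d}\right), \qquad d \in \{1, \dots, 9\} \quad \text{(Benford)}$$

$$f(w) \propto \frac{1}{\operatorname{rank}(w)} \quad \text{(Zipf)}$$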
Does anyone have an idea why this happens?
(see also this question)



As a rough, somewhat intuitive explanation of why Benford's Law makes sense, consider it with respect to amounts of money. The amount of time (or effort, or work) needed to get from \$1000 to \$2000 (a 100% increase) is a lot greater than the amount needed to get from \$8000 to \$9000 (a 12.5% increase), because money usually grows in proportion to the amount one already has. Looked at the other way, it should take a fixed amount of time to, say, double one's money, so going from \$1000 to \$2000 takes as long as going from \$2000 to \$4000, and from \$4000 to \$8000. The leading digit therefore spends more time at lower values than at higher ones. Because the growth is exponential, the fraction of time spent at each leading digit is logarithmic: the leading digit equals $d$ exactly while $\log_{10}$ of the value lies between $\log_{10} d$ and $\log_{10}(d+1)$, an interval of width $\log_{10}(1 + 1/d)$.
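As a quick numerical sanity check of this intuition (a minimal sketch; the 1% growth rate and the step count are arbitrary choices of mine, not anything canonical), one can compound a value exponentially and tally the leading digit at each step:

```python
import math
from collections import Counter

# Grow a quantity by 1% per step (exponential growth) and record
# the leading digit at every step.
value = 1.0
counts = Counter()
steps = 100_000
for _ in range(steps):
    value *= 1.01
    if value >= 10.0:
        value /= 10.0  # renormalizing does not change the leading digit
    counts[int(value)] += 1  # value stays in [1, 10), so int(value) is the leading digit

print("digit  simulated  Benford")
for d in range(1, 10):
    print(f"{d:5d}  {counts[d] / steps:9.4f}  {math.log10(1 + 1/d):7.4f}")
```

The simulated frequencies come out close to $\log_{10}(1 + 1/d)$, matching the "time spent at each leading digit" argument above.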