(sorry if this is obvious or it has already been answered)
If you generate a lot of random (uniformly distributed) values between 0 and 1 and take the average, the answer gets closer and closer to 0.5. But can you actually say that the average converges as the number of samples goes to infinity? I couldn't decide, because I kept going back and forth between two thoughts:
- Since it's uniformly distributed, obviously the average should approach 0.5
- It includes randomness, so technically speaking, every single draw could come out as 0.7, in which case the average wouldn't converge to 0.5
Yes — by the (strong) law of large numbers, the sample mean converges to the true expected value (0.5 here) with probability 1 as the sample size increases. Your second point is the right thing to worry about: a sequence where every draw is 0.7 is possible, so the convergence isn't guaranteed for *every* sequence of outcomes. But the set of such non-converging sequences has probability zero, which is exactly what "almost sure convergence" means.
Law of Large Numbers
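You can watch this happen in a quick simulation. A minimal sketch in Python using the stdlib `random` module (the checkpoint sizes and seed are arbitrary choices, just for illustration): the running mean wanders for small n but settles near 0.5 as n grows.

```python
import random

def running_means(n, seed=0):
    """Sample mean of the first k uniform draws, recorded at a few checkpoints."""
    random.seed(seed)  # fixed seed so the run is reproducible
    total = 0.0
    means = {}
    checkpoints = {10, 1_000, 100_000, n}
    for k in range(1, n + 1):
        total += random.random()  # uniform draw on [0, 1)
        if k in checkpoints:
            means[k] = total / k
    return means

for k, m in running_means(1_000_000).items():
    print(f"n = {k:>9,}: mean = {m:.4f}")
```

Changing the seed changes the early checkpoints noticeably, but the final mean stays close to 0.5 — the typical deviation after n draws shrinks like 1/sqrt(n).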