When I toss a coin 99 times, and it's heads every time, the mathematical probability that the next toss is also heads, completing a series of 100, is still 50%, because each toss is independent of the previous ones. And I fully understand this.
But on the other hand, my life has a limited timespan, and I can't spend my whole life flipping coins. So suppose I spend a few weeks of my life flipping at a high frequency; even then, the probability of my witnessing a series of 100 during that time is much smaller than that of observing a series of 99. All because I have a limited number of tries.
What I don't understand about statistics / probability theory is that they never seem to account for the above. They just say that the chance of extending the series to 100 is 50%, and to me that seems true only on average across all people, because together they have an almost infinite number of tries. But for one single observer to witness such a unique event, the chance feels lower.
So am I correct that the chance of making a series of 100 is lower for ME than the mathematical chance would be for the average person?
No, you are not correct. The chance of heads given 99 heads is always going to be 50%, regardless of who throws the coin. The 'average person' does not have an infinite number of tries. While it is more likely that someone other than you will get 100 heads than that you will, that is only because there are many people other than you. Each of them has precisely the same chance.
You argue that "the probability of me witnessing a series of 100 during that time is much smaller than that of observing a series of 99."
Of course, it is half as large. But it is half as large for every other person who goes on mad coin-flipping sprees as well. Why would your flipping the coin improve your chances over someone else flipping it?
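One way to convince yourself of this independence is to simulate it. Here is a rough sketch in Python (the variable names and the run length of 5 are my own choices; I use 5 instead of 99 so the event actually occurs often enough to measure). It checks how often the flip immediately after a run of heads is itself heads:

```python
import random

random.seed(0)  # fixed seed so the result is reproducible

RUN = 5  # stand-in for 99; a run of 99 is far too rare to simulate
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the flip that immediately follows each fresh run of RUN heads.
next_after_run = []
streak = 0
for i, heads in enumerate(flips[:-1]):
    streak = streak + 1 if heads else 0
    if streak == RUN:
        next_after_run.append(flips[i + 1])

frac = sum(next_after_run) / len(next_after_run)
print(f"P(heads | {RUN} heads in a row) is roughly {frac:.3f}")
```

Despite the coin "knowing" it just produced a long run of heads, the fraction comes out close to 0.5, which is exactly the independence the math asserts.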
I feel I wasn't really able to understand and answer your question so far, so please comment if you feel I'm missing something.