I ran a simulation that went as follows:
The percentage chance of an event happening started at 0%.
In each iteration, 0.0103% was added to this percentage chance.
So, for example, on the 3rd iteration the chances of the event happening would be 0.0309%.
By "percentage sum" I mean the sum of all the per-iteration chances up to that point. At iteration 3, the percentage sum would be 0.0103 + (2 * 0.0103) + (3 * 0.0103) = 0.0618%.
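As a quick arithmetic check of that definition (a throwaway snippet, not part of the actual simulation):

```python
step = 0.0103  # percentage points added per iteration
# per-iteration chances at iterations 1..3: 0.0103%, 0.0206%, 0.0309%
percentage_sum = sum(step * k for k in range(1, 4))
print(percentage_sum)  # ~ 0.0618 (percent)
```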
In the simulation, the event happened when the percentage sum was around 86%. This confused me, because I figured the percentage sum should definitely be less than 50% when the event occurred. My reasoning:
Let's say the chance of an event happening started at 0% and grew by 10% each day.
Day 1: 0% chance the event could happen. Day 2: 10% chance the event could happen. Day 3: 20% chance the event could happen. Day 4: 30% chance the event could happen.
At this point it's likely the event would have happened, yet the percentage sum is only 10 + 20 + 30 = 60%. So why in my simulation did the event happen at 86%?
NOTE: I ran the simulation 100 times and took the average, so this isn't just one odd instance.
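The setup above can be sketched in a few lines of Python (the function name, seed, and formatting are my own; the step size and 100-run average match the question):

```python
import random

def percentage_sum_at_event(step=0.000103):
    """Run one simulation: grow the per-iteration chance by `step`
    each iteration and return the running sum of chances (as a
    fraction, so 1.0 == 100%) at the moment the event fires."""
    chance = 0.0
    total = 0.0
    while True:
        chance += step
        total += chance
        if random.random() < chance:
            return total

random.seed(42)  # fixed seed so the run is repeatable
runs = 100
average = sum(percentage_sum_at_event() for _ in range(runs)) / runs
print(f"average percentage sum when the event happened: {average:.0%}")
```

Over 100 runs the average lands somewhere in the general vicinity of 100%, with noticeable run-to-run spread, which is consistent with the 86% observed.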
Your percentage sum is, in a sense, the expected number of times the event would have happened in the first $n$ iterations (if it could happen more than once). So when the event actually happens, this number should be close to $100$%, and $86$% is already pretty close. If you performed the experiment, say, $100000$ times, the average would be even closer to $100$%.
One fundamentally misleading intuition, which I share myself, is to think an event should have happened once the accumulated probability exceeds $50$%. In reality, $50$% only means the event happens half of the time; the sum has to get near $100$% before the event should be expected to have occurred.
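This can be made precise in the continuous limit, treating the growing chance as a hazard rate $\lambda(t)$ whose running integral $\Lambda(t)$ plays the role of the "percentage sum" (a sketch I'm adding, not part of the original answer):

$$S(t) = \Pr(\text{no event by time } t) = e^{-\Lambda(t)}, \qquad \Lambda(t) = \int_0^t \lambda(s)\,ds.$$

At the random event time $T$, the value $\Lambda(T)$ is exponentially distributed with mean $1$, so

$$\mathbb{E}[\Lambda(T)] = 1 = 100\%, \qquad \operatorname{median}[\Lambda(T)] = \ln 2 \approx 69\%.$$

So the percentage sum averages $100$% at the moment the event fires, even though in a typical single run it is often well below that.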
P.S.: For reference, a program I wrote gives the average "percentage sum" as $\frac{10081}{10000} \approx 100.81\%$, very close to $100$ percent, over $100000$ trials. (I used a 0.01% per-iteration increase rather than 0.0103%.)
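The original program isn't included in the post; here is a reconstruction under the stated assumptions (0.01% per-iteration increase; trial count scaled down from $100000$ to keep the run quick):

```python
import random

def one_trial(step=0.0001):
    """Grow the chance by 0.01 percentage points per iteration and
    return the percentage sum (as a fraction) when the event fires."""
    chance = 0.0
    total = 0.0
    while True:
        chance += step
        total += chance
        if random.random() < chance:
            return total

random.seed(0)
trials = 10_000  # scaled down from the post's 100000 for speed
avg = sum(one_trial() for _ in range(trials)) / trials
print(f"average percentage sum: {avg * 100:.2f}%")
```

With this many trials the average comes out very close to 100%, matching the $\approx 100.81\%$ reported above.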