Context:
Hi, I started by calculating the probability that an event with a chance of 1 in x happens at least once within n attempts. Next I wanted to invert this approach, so I can say how many attempts n are needed to reach a given probability P(n) of the event happening.
Examples:
- With a chance of 1 in 4.2 billion per kilometer, how many kilometers does one have to drive to have a chance of at least 90% of being in a fatal car crash?
- With a chance of 1 in 140 million, how often does one have to play the lottery to have a chance of at least 50% of winning?
My current approach:
- I came up with the following formula for the probability that the event happens at least once in $n$ attempts, given a chance of 1 in $x$ per attempt: $P(n) = 1-(1-1/x)^n$
- Next I solved for $n$: $n = \dfrac{\log(1 - P(n))}{\log(1 - 1/x)}.$
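Since I come from programming, here is how I translated the two formulas into code (the function names are my own; I use `math.log1p(-t)`, which computes $\log(1-t)$ accurately even when $t = 1/x$ is tiny):

```python
import math

def p_after_n(x: float, n: float) -> float:
    """Probability of at least one success in n attempts, each with chance 1/x."""
    return 1 - (1 - 1 / x) ** n

def n_for_p(x: float, p: float) -> float:
    """Number of attempts needed to reach probability p; inverse of p_after_n."""
    return math.log1p(-p) / math.log1p(-1 / x)

# Lottery example: chance 1 in 140 million, target probability 50%
n = n_for_p(140e6, 0.5)
print(f"{n:,.0f} plays")  # roughly 97 million plays

# Sanity check: plugging n back in recovers the target probability
print(p_after_n(140e6, n))  # ~0.5
```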
Results:
For my first approach I expected that setting $n = x$ would always yield the same $P(n)$, and that this value would be $1\sigma$ (about 68%). The results are very close to each other, but not identical, and a little off from the expected $1\sigma$. Were my expectations wrong, and if so, why? Or is my formula wrong in the first place?
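For what it's worth, here is the small numerical experiment behind that observation: evaluating $P(n)$ at $n = x$ for growing $x$. The values drift toward $1 - 1/e \approx 0.632$ (since $(1-1/x)^x \to 1/e$), which is close to, but not the same as, the $\approx 0.683$ of one standard deviation; maybe that is related to what I am seeing:

```python
import math

# P(n) = 1 - (1 - 1/x)^n evaluated at n = x for growing x
for x in [10, 100, 10_000, 1_000_000]:
    p = 1 - (1 - 1 / x) ** x
    print(f"x = {x:>9,}: P(x) = {p:.6f}")

# Known limit: (1 - 1/x)^x -> 1/e as x -> infinity, so P(x) -> 1 - 1/e
print(f"1 - 1/e = {1 - math.exp(-1):.6f}")
```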
As for my second approach: if the first one is wrong, it is of course wrong too. Also, the resulting formula feels more complex than it needs to be.
Both of my results are very close to what I calculated manually using an iterative approach, so I think I might not be too far off.
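The iterative cross-check I mentioned works by multiplying up the miss probabilities one attempt at a time and comparing against the closed form; a sketch (my own loop, the test values are arbitrary):

```python
def p_iterative(x: float, n: int) -> float:
    """Probability of at least one success in n attempts, built up step by step."""
    miss = 1.0  # probability of having missed every attempt so far
    for _ in range(n):
        miss *= 1 - 1 / x
    return 1 - miss

def p_closed(x: float, n: int) -> float:
    return 1 - (1 - 1 / x) ** n

# The two should agree up to floating-point rounding
for x, n in [(4, 4), (140, 97), (1000, 693)]:
    print(x, n, p_iterative(x, n), p_closed(x, n))
```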
I appreciate your comments, and as I have never asked a question on Math StackExchange before and have a programming background, I'd also appreciate any other slightly off-topic advice on how I did.