Recently, I've been reinvigorated to build a better intuition for probability and statistics by its several simultaneously vexing and intriguing paradoxes, one of which I discovered through my interest in programming.
I have a program that generates a random number between 0 and 1 over and over until it gets one greater than some number N, counting how many draws this takes. It then keeps a tally over each possible number of tries, counting how often each occurred. In essence, it repeats a trial with success probability 1 − N until it succeeds, and tells me how many runs got a success in exactly X attempts.
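For reference, here is a minimal sketch in Python of the experiment I'm describing (the function names are my own, and the trial counts are arbitrary):

```python
import random

def attempts_until_success(n):
    """Draw uniform(0, 1) numbers until one exceeds n; return how many draws it took."""
    attempts = 1
    while random.random() <= n:
        attempts += 1
    return attempts

def histogram(n, trials):
    """Tally how often each attempt count occurred over many repetitions."""
    counts = {}
    for _ in range(trials):
        x = attempts_until_success(n)
        counts[x] = counts.get(x, 0) + 1
    return counts
```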
Now, originally I was expecting a bell curve around the number 1/p, where p would be 1 − N, or just the probability of success. However, I was completely surprised to find that the most common result was 1, and the probability trails off from there! The average number of attempts still comes out to 1/p, I suppose, but it's simply baffling to me that the graph looks this way. For an example of why it seems paradoxical to me, take the case of N = 0.99, so p = 0.01 or 1 in 100. It is more likely that you get a result before fifty attempts than that you get one between 75 and 125 attempts.
I don't know if anyone here is able to shed light on the topic, as it may just come down to poor intuition, but I felt like it was striking enough to post a question regarding it. Below I will post a graph of my results from 10,000,000 trials where X is the number of attempts before getting the result, and Y is how many times it took exactly that number of attempts.

Your $N$ is presumably in $[0,1)$, i.e. $0 \le N \lt 1$, making $p=1-N$ the probability of exceeding $N$, so $0 \lt p \le 1$.
What you have in $X$ is a geometrically distributed random variable as the number of attempts until you exceed $N$. The probability mass function for $X$ is $$\mathbb P(X=x) = (1-p)^{x-1}p$$ where $x \in \{1,2,3,\ldots\}$ since you have to initially fail $x-1$ times and then succeed $1$ time. You get the sort of exponentially decreasing curve you illustrated from your simulation. You can extend this to the cumulative distribution function $$\mathbb P(X\le x) = 1- (1-p)^{x}$$
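These two formulas are easy to check numerically; here is a quick sketch in Python (function names are mine):

```python
def geom_pmf(x, p):
    """P(X = x): fail x-1 times, then succeed once."""
    return (1 - p) ** (x - 1) * p

def geom_cdf(x, p):
    """P(X <= x) = 1 - (1-p)^x: the chance of succeeding within x attempts."""
    return 1 - (1 - p) ** x
```

For $p = 0.01$ the pmf is maximized at $x = 1$ and decreases geometrically, which is exactly the shape of your simulated histogram.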
Your thought that it is more likely to have a result before $50$ attempts than between $75$ and $125$ attempts is basically correct, since the latter involves failing $75$ times and then succeeding within the next $49$ attempts, and that is $$\mathbb P(75 \lt X \lt 125) = \mathbb P(X \le 124) - \mathbb P(X \le 75) \\=(1- (1-p)^{124})-(1-(1-p)^{75})= (1-p)^{75}-(1-p)^{124} = (1-p)^{75}(1-(1-p)^{49}) \\= \mathbb P(X\gt 75)\, \mathbb P(0 \lt X \lt 50) \\ \lt \mathbb P(0 \lt X \lt 50)$$ i.e. less than the probability of just succeeding in the first $49$ attempts, and I find this an intuitively sensible result.
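Plugging in your example $p=0.01$ makes the gap concrete (a sketch; the variable names are mine):

```python
p = 0.01
q = 1 - p  # probability of a single failure

p_first_49 = 1 - q ** 49           # P(X <= 49): a result before 50 attempts
p_75_to_124 = q ** 75 - q ** 124   # P(75 < X < 125)

# p_first_49 ≈ 0.389, while p_75_to_124 ≈ 0.183 — roughly half as likely,
# despite the 75-to-125 window covering the mean of 100 attempts.
```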