My question is more practical than theoretical. I'm trying to simulate a natural process by generating events randomly. Let's say that I know some particular event happens following a normal distribution with a known mean and standard deviation.
For example, let's say I know that on average I get 100 visitors per second on my webpage, with a standard deviation of 30. Now I want to generate requests that simulate this traffic. As far as I can remember, arrivals in such a process follow a Poisson distribution, so I can generate an exponentially distributed waiting time to tell me when the next request should be sent. If I do it that way, the average number of requests generated per second should match the mean of the normal distribution I started from, as long as:
Poisson-Lambda = Gaussian-Mean   (i.e. the mean time between events is 1 / Gaussian-Mean)
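To make it concrete, here is roughly what I have so far (a minimal Python sketch; I'm assuming exponential inter-arrival times with rate equal to the Gaussian mean, which is exactly the part I'm unsure about, since the standard deviation appears nowhere):

```python
import random

MEAN_PER_SEC = 100.0  # Gaussian mean: average visitors per second

def next_interval(rate=MEAN_PER_SEC):
    # random.expovariate(lambd) returns an exponentially distributed
    # waiting time with mean 1 / lambd -- the inter-arrival time of a
    # Poisson process with the given rate
    return random.expovariate(rate)

def events_in_one_second(rate=MEAN_PER_SEC):
    # Simulate one second of traffic and count how many events occurred
    t, count = 0.0, 0
    while True:
        t += next_interval(rate)
        if t >= 1.0:
            return count
        count += 1
```

With this, the per-second counts average out to about 100, but their spread is fixed by the Poisson process itself (stddev around sqrt(100) = 10), not the 30 I actually measured.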
My problem is that I don't remember how I should include the standard deviation of the Gaussian in this mix! So, here's my question in one sentence:
How can I come up with the time between my events, given the Gaussian distribution parameters (mean and standard deviation) for how many events happen in a second?