Is it normal for the standard deviation to fluctuate, or does it always have to decrease as the number of samples increases?
I'm running an experiment and analyzing the standard deviation. With 30 samples it shows a smaller standard deviation than with 100 samples, although the difference is small. Is this normal? Does the standard deviation have to decrease whenever the number of samples is increased?
example:

41.70797952 - 30 samples
43.22036557 - 50 samples
43.39343268 - 100 samples
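To check this, I compare the sample standard deviation at different sample sizes. Below is a minimal sketch of that comparison; the normal distribution with mean 600 and SD 40 is only a stand-in for my real profit distribution, not the actual process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the true profit distribution (assumed here: normal
# with mean 600 and SD 40, purely for illustration).
true_sd = 40.0

for n in (30, 50, 100):
    sample = rng.normal(loc=600, scale=true_sd, size=n)
    # ddof=1 gives the unbiased sample standard deviation.
    print(f"n={n:4d}  sample SD = {sample.std(ddof=1):.2f}")
```

With different seeds, the SD at n=30 sometimes comes out smaller than at n=100, which is the behavior I'm asking about.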
I'm calculating the profit made in 600 minutes. The profit is the result of several distributions: the arrival time of each customer varies, as do the customer type and the processing time, and each type of customer pays a different amount. Therefore, for each 600-minute run (one day of work), the profit will be different. A sketch of this kind of simulation is shown after the example below.
example profit from 10 runs:
590 610 560 620 620 510 630 610 620 600
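For context, here is a rough sketch of the kind of simulation I'm running. All of the specific distributions, the arrival rate, the type probabilities, and the payment amounts below are placeholders for illustration, not my real parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_day(minutes=600):
    """One 600-minute working day: customers arrive at random times,
    each customer has a random type, and each type pays a different
    amount. All parameters here are assumed placeholders."""
    profit = 0.0
    t = 0.0
    while True:
        t += rng.exponential(scale=5.0)  # assumed inter-arrival time (minutes)
        if t > minutes:
            break
        # Two assumed customer types with different payments.
        ctype = rng.choice([0, 1], p=[0.7, 0.3])
        profit += (8.0, 15.0)[ctype]
    return profit

profits = [simulate_day() for _ in range(10)]
print(profits)
print("sample SD over 10 runs:", np.std(profits, ddof=1))
```

The sample SD over these runs is the number I'm comparing across different numbers of runs (30, 50, 100) in the example above.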