Charting Histogram Averages and Standard Deviation


Pretty basic question I'm sure:

I have 100 histograms, all created by randomly sampling the same data 100 times.

Now, I want to show the 'average histogram' with markers at +1 standard deviation.

I know how to compute the average: just add the histograms bin-wise and divide by the number of histograms.

But for the standard deviation: do I use all the bin-heights from all 100 histograms to calculate the standard deviation, or do I use the bin-heights from my 'average histogram' to calculate it?

Thanks.


Best Answer

Calculate the average and standard deviation separately for each bin, using the bin heights from all 100 histograms. The standard deviation can be computed as $\hat{\sigma} = \sqrt{\overline{x^2}-\overline{x}^2}$. For each bin, $\overline{x}$ is that bin's height in the average histogram, while $\overline{x^2}$ is the average of the squared heights of that bin over all 100 histograms.
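To make this concrete, here is a minimal NumPy sketch of the bin-wise computation. The data, bin edges, and sample sizes are all hypothetical stand-ins for your actual setup; `std(axis=0)` with its default `ddof=0` is exactly the $\sqrt{\overline{x^2}-\overline{x}^2}$ formula above, applied per bin across the 100 histograms.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=10_000)        # hypothetical underlying data
bins = np.linspace(-4, 4, 21)          # 20 bins (hypothetical edges)

# Build 100 histograms, each from a random sample of the same data.
hists = np.array([
    np.histogram(rng.choice(data, size=1_000), bins=bins)[0]
    for _ in range(100)
])                                     # shape: (100, 20)

# Bin-wise average histogram and bin-wise standard deviation.
mean_hist = hists.mean(axis=0)         # x-bar for each bin
std_hist = hists.std(axis=0)           # sqrt(mean(x^2) - mean(x)^2) per bin

# Markers at +/- 1 standard deviation around each average bin height.
upper = mean_hist + std_hist
lower = mean_hist - std_hist
```

You could then plot `mean_hist` as a bar chart with `upper`/`lower` as error bars (e.g. via `matplotlib`'s `plt.bar(..., yerr=std_hist)`).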