I am fairly new to statistics. I ran some simulations and collected a lot of data. From the data I used an AWK script to calculate the average, $\bar x$; the minimum, $x_0$; and the standard deviation, $\sigma$ (the population version, where you divide by $N$, not $N-1$).
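The single-pass AWK computation described above can be sketched as follows (a sketch only, assuming one numeric value per line; the sample values piped in are purely illustrative):

```shell
# Running sums give the mean; the sum of squares gives the population
# variance (divide by N, not N-1), matching the sigma described above.
printf '1\n2\n3\n4\n' | awk '
NR == 1 { min = $1 }                 # initialise the minimum
{
    sum   += $1                      # running sum for the mean
    sumsq += $1 * $1                 # running sum of squares
    if ($1 < min) min = $1
}
END {
    mean = sum / NR
    var  = sumsq / NR - mean * mean  # population variance
    printf "mean=%g min=%g sigma=%g\n", mean, min, sqrt(var)
}'
# prints: mean=2.5 min=1 sigma=1.11803
```

Replace the `printf` with `cat yourfile` to run the same pass over real data.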
Now I want to plot the data. I suppose I can draw each bar $\bar x$ high, but I am confused about how long my error bar should be. Should it be
- one standard deviation long ($\sigma$, roughly 68% coverage for normally distributed data),
- $2\sigma$ (roughly 95%) or $3\sigma$ (roughly 99.7%) long,
- or drawn all the way from the minimum value to the maximum value?
> Error bars often represent one standard deviation of uncertainty, one standard error, or a particular confidence interval (e.g., a 95% interval). — Wikipedia
When you talk about plotting the standard deviation, are you sure the data being plotted are not skewed? If they are, a symmetric $\pm\sigma$ error bar can be misleading, so check the shape of the distribution before deciding what the bars should represent.