How to calculate sample variance?


I have an infinite population. Out of this population, I'm picking $N=40$ items, each of which can have a value of $0$ or $1$. Say I get a mean of $0.7$. How do I calculate the standard error of this value?

I found the formula $$\sqrt{\frac{(1-\hat{p})(\hat{p})}{N}} = \sqrt{\frac{(1-0.7)(0.7)}{40}}$$ for the standard error. However, this doesn't seem to make much sense when $\hat{p}$ is $0$, because then even for a very small sample I get a standard error of $0$. So, I wonder if I'm using the right formula, and whether it is an approximation of some sort.
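To make the question concrete, here is a small Python sketch of the plug-in formula above (the function name `proportion_se` is just for illustration); it reproduces the numbers in the question and also shows the degenerate $\hat{p}=0$ case I'm asking about:

```python
import math

def proportion_se(p_hat, n):
    """Estimated standard error of a sample proportion,
    using the plug-in formula sqrt(p_hat * (1 - p_hat) / n)."""
    return math.sqrt(p_hat * (1 - p_hat) / n)

# The case from the question: p_hat = 0.7, N = 40
print(round(proportion_se(0.7, 40), 4))  # prints 0.0725

# The degenerate case: p_hat = 0 gives a standard error of 0
# even for a tiny sample, which is what seems suspicious.
print(proportion_se(0.0, 5))  # prints 0.0
```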