For normally distributed data, sample variance $s^2$ is:
$$s^2 = \frac1{n-1}\sum (Y_i - \bar Y)^2$$
For a Bernoulli distribution, the sample variance is:
$$\frac{n\,\hat p \hat q}{n-1}$$
$\hat p = \frac{\sum Y_i}n$, where $Y_i$ is a dummy variable taking on values 0 (e.g., failure) and 1 (e.g., success). Also, $\hat q = 1 - \hat p$.
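As a quick numerical sketch (the sample values are made up for illustration), note that for 0/1 data the sample mean $\bar Y$ and the proportion of successes $\hat p$ are the same quantity:

```python
# Hypothetical sample of dummy variables: 1 = success, 0 = failure
Y = [1, 0, 1, 1, 0, 1, 0, 1]
n = len(Y)

p_hat = sum(Y) / n   # proportion of successes
q_hat = 1 - p_hat    # proportion of failures
Y_bar = sum(Y) / n   # sample mean

# For 0/1 data, the sample mean is exactly p_hat
assert p_hat == Y_bar
```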
Are these variances equivalent?
I can easily derive the mean and variance of each distribution, but that doesn't by itself answer the question.
Alternatively, is there any way to directly derive one variance from the other (if they are in fact equivalent)?
Any help would be greatly appreciated.
Since $\hat p = \bar Y$ and $\hat q = 1 - \hat p$, first expand $\hat p\hat q$: \begin{align} \hat p \hat q = \frac{Y_1+\cdots+Y_n}n\cdot\frac{n - (Y_1+\cdots+Y_n)}n = \bar Y(1-\bar Y). \end{align}
Let $A = \{ i\in\{1,\ldots,n\}: Y_i=1 \}$ and $B = \{ i\in\{1,\ldots,n\}: Y_i=0 \}$. Notice that the number of members of $A$ is $n\bar Y$ and the number of members of $B$ is $n(1-\bar Y)$. Then we have \begin{align} \sum_{i=1}^n (Y_i-\bar Y)^2 & = \sum_{i\in A} (1-\bar Y)^2 + \sum_{i\in B} (0-\bar Y)^2 \\[8pt] & = n\bar Y(1-\bar Y)^2 + n(1-\bar Y)(0-\bar Y)^2 \\[8pt] & = n\bar Y(1-\bar Y)\Big( (1-\bar Y) + \bar Y \Big) \\[8pt] & = n\bar Y(1-\bar Y)\cdot 1 = n\hat p\hat q. \end{align} Dividing by $n-1$ then gives $s^2 = \dfrac{n\,\hat p\hat q}{n-1}$.
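The identity derived above, $\sum_{i=1}^n (Y_i-\bar Y)^2 = n\hat p\hat q$, can be sanity-checked numerically on random 0/1 samples (the sample sizes and seed here are arbitrary):

```python
import random

random.seed(0)
for _ in range(5):
    n = random.randint(2, 50)
    Y = [random.randint(0, 1) for _ in range(n)]
    Y_bar = sum(Y) / n
    p_hat, q_hat = Y_bar, 1 - Y_bar

    # Sum of squared deviations equals n * p_hat * q_hat
    ss = sum((y - Y_bar) ** 2 for y in Y)
    assert abs(ss - n * p_hat * q_hat) < 1e-9

    # Hence the sample variance is n * p_hat * q_hat / (n - 1)
    s2 = ss / (n - 1)
    assert abs(s2 - n * p_hat * q_hat / (n - 1)) < 1e-9
```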