Let $x_1, \ldots, x_n$ be a simple random sample from a Bernoulli distribution with parameter $p$, $0<p<1$. Let $\overline{\textbf{x}}$ be the sample mean and $S^2=\frac{1}{n-1}\sum_{i=1}^n (x_i-\overline{x})^2$. Calculate the probability function of $\overline{\textbf{x}}$ and of $S^2$.
I tried using the moment-generating function, since it uniquely determines the probability function.
$$M_{\overline{\textbf{x}}}(t)=E(e^{t\overline{\textbf{x}}})=E(e^{t\frac{1}{n}\sum_{i=1}^n x_i})=E(e^{tx_1/n + \ldots + tx_n/n})=[\text{s.r.s.}]=$$ $$=\prod_{i=1}^n E(e^{\frac{tx_i}{n}})=\prod_{i=1}^n[(1-p)+ pe^{t/n}]=[(1-p)+ pe^{t/n}]^n=M_Y(t/n),$$ where $M_Y$ is the m.g.f. of $Y\sim Bi(n,p)$.
But this is not the m.g.f. of any probability function I recognize. I also thought of the following:
If $U$ has m.g.f. $M_U(t)$, then $aU+b$ has m.g.f. $e^{bt}M_U(at)$,
so $\overline{\textbf{x}}\thicksim \frac{1}{n}Bi(n,p)$.
Or something like that; first, I don't know whether this is legitimate, and second, I don't know what $\frac{1}{n}Bi(n,p)$ is or whether it is a probability function.
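As a sanity check on the guess, one can compare the derived m.g.f. $[(1-p)+pe^{t/n}]^n$ against the m.g.f. of $T/n$ with $T\sim Bi(n,p)$, computed directly from the binomial PMF as $E(e^{tT/n})=\sum_k \binom{n}{k}p^k(1-p)^{n-k}e^{tk/n}$. This is a minimal numerical sketch (the values of $n$, $p$, and $t$ are illustrative):

```python
import math

n, p = 5, 0.3

def mgf_derived(t):
    # The m.g.f. obtained in the derivation above: [(1-p) + p*e^(t/n)]^n
    return ((1 - p) + p * math.exp(t / n)) ** n

def mgf_scaled_binomial(t):
    # E[e^(t*T/n)] for T ~ Binomial(n, p), summed directly over the PMF
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k / n)
               for k in range(n + 1))

# The two agree up to floating-point roundoff for every t,
# supporting the identification of x-bar with Bi(n,p)/n.
for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs(mgf_derived(t) - mgf_scaled_binomial(t)) < 1e-12
```

The agreement is exact by the binomial theorem, which is what makes the "divide a binomial by $n$" reading legitimate.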
PS: sorry about my English.
**Outline:**
Sample mean. If the $n$ independent $X_i$ are $Bernoulli(p)$, then $T = \sum_{i=1}^n X_i \sim Binom(n, p)$, from which it is easy to find the PMF of $\bar X = T/n$: it puts probability $\binom{n}{k}p^k(1-p)^{n-k}$ on each value $k/n$, for $k = 0, \ldots, n$.
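Concretely, the PMF of $\bar X$ can be tabulated by pushing the binomial PMF forward through $k \mapsto k/n$; a short sketch with illustrative values of $n$ and $p$:

```python
import math

n, p = 4, 0.5

# PMF of the sample mean: X-bar takes the value k/n with the binomial
# probability P(T = k) = C(n,k) p^k (1-p)^(n-k), for k = 0, ..., n.
pmf_xbar = {k / n: math.comb(n, k) * p**k * (1 - p)**(n - k)
            for k in range(n + 1)}

# A valid probability function: nonnegative masses summing to 1.
assert abs(sum(pmf_xbar.values()) - 1.0) < 1e-12
```

For example, with $n=4$ and $p=1/2$, $P(\bar X = 1/2) = \binom{4}{2}/16 = 0.375$.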
Sample Variance. Notice that $$S^2 = \frac{1}{n-1}\left(\sum_{i=1}^n X_i^2 - n\bar X^2 \right).$$ Also, for Bernoulli random variables, which take only values $0$ and $1$, we have
$$\sum_{i=1}^n X_i = \sum_{i=1}^n X_i^2,$$
and that ought to simplify the rest of it.
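Carrying that simplification through: with $T = \sum X_i = \sum X_i^2 \sim Binom(n,p)$, the identity gives $S^2 = \frac{T - T^2/n}{n-1} = \frac{T(n-T)}{n(n-1)}$, so the PMF of $S^2$ is obtained by pooling the binomial probabilities of the $k$ that map to the same value (note $k$ and $n-k$ always do). A sketch with illustrative $n$ and $p$:

```python
import math
from collections import defaultdict

n, p = 4, 0.3

# S^2 = (T - T^2/n)/(n-1) = T(n-T)/(n(n-1)) for T ~ Binomial(n, p).
# The map k -> k(n-k) is symmetric about n/2, so k and n-k yield the
# same value of S^2 and their binomial probabilities are pooled.
pmf_s2 = defaultdict(float)
for k in range(n + 1):
    s2 = k * (n - k) / (n * (n - 1))
    pmf_s2[s2] += math.comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: the pooled masses form a probability function.
assert abs(sum(pmf_s2.values()) - 1.0) < 1e-12
```

For instance, $P(S^2 = 0) = P(T=0) + P(T=n) = (1-p)^n + p^n$, the two constant samples.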