Consider the set of numbers $\ x_1, \ldots, x_m , y_1, \ldots, y_n $ where $\ x_i=0 $ for $i = 1,\ldots, m$ and $\ y_i=1 $ for $i = 1,\ldots, n$.
Show that the mean $M$ of this set is given by $\frac{n}{m+n}$ and the standard deviation $S$ by $\frac{ \sqrt{mn}} {m+n} $.
I know the definitions of the mean and standard deviation and how to compute them, but I'm really stuck on this question.
The sample variance of observations denoted $x_i$ is $$S^2 = \frac{\sum_{i=1}^n (x_i - \bar x)^2}{n-1} = \frac{\sum_{i=1}^n x_i^2\; -\; \left(\sum_{i=1}^n x_i\right)^2/n}{n-1}.$$
The middle expression is the definition, and the last is sometimes called the 'computational formula' (easily deduced from the definition). Is that the definition of $S$ you're using?
Notice that for $x_i \equiv 0,$ one has $\sum_{i=1}^m x_i = \sum_{i=1}^m x_i^2 = 0.$
Also, for $y_i \equiv 1,$ one has $\sum_{i=1}^n y_i = \sum_{i=1}^n y_i^2 = n.$
Put it all together (with total sample size $m+n$), and you should be off to a good start, provided I have properly interpreted a somewhat vague question.
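As a quick sanity check, you can verify the claimed formulas numerically. A sketch below (the values of $m$ and $n$ are arbitrary example choices, not from the question). Note that $\frac{\sqrt{mn}}{m+n}$ matches the *population* standard deviation, dividing by $m+n$, rather than the $n-1$ sample formula quoted above, which is worth confirming with whoever set the problem:

```python
import numpy as np

# m zeros and n ones, as in the question; m and n are arbitrary example values
m, n = 4, 6
data = np.array([0] * m + [1] * n)

# Mean: n ones out of m + n values, so n / (m + n)
mean = data.mean()

# ddof=0 gives the population standard deviation (divide by m + n);
# this is the convention that matches sqrt(m*n) / (m + n)
sd = data.std(ddof=0)

print(mean, n / (m + n))                    # both equal 0.6
print(sd, np.sqrt(m * n) / (m + n))         # both equal ~0.4899
```

If instead you use `ddof=1` (the $n-1$ sample formula), the result will differ from $\frac{\sqrt{mn}}{m+n}$ by a factor of $\sqrt{\frac{m+n}{m+n-1}}$.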