There is a fairly standard technique for removing outliers from a sample using the standard deviation.
Specifically, the technique is: remove from the sample dataset any point that lies more than 1 (or 2, or 3) sample standard deviations (the usual unbiased estimate) from the sample mean. Is it possible with this technique that one ends up removing all points from the dataset? Or is there a property of the sample standard deviation that prevents this from happening?
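As a minimal sketch of the technique (the function name and the cutoff parameter `k` are my own choices, not part of any standard API):

```python
import statistics

def remove_outliers(data, k=1):
    """Drop points more than k sample standard deviations from the mean.

    Uses the unbiased (n-1) sample standard deviation, as in the
    question; k is the hypothetical cutoff multiplier (1, 2, or 3).
    """
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)  # unbiased sample stdev (divides by n-1)
    return [x for x in data if abs(x - mu) <= k * sigma]

print(remove_outliers([1, 2, 3, 100]))  # the obvious outlier 100 is dropped
```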
Duh. It's quite simple to show that at least one data point of any sample lies within one standard deviation of the mean.
Proof:
Assume every data point is more than one standard deviation away from the mean; that is,
$|x_i-\mu| > \sigma$, for all $1 \le i \le n$
Squaring and summing over all points, we get
$\sum\limits_{i=1}^n(x_i-\mu)^2 > n\sigma^2$
But by the definition of $\sigma$ (the sample standard deviation),
$(n-1)\sigma^2 = \sum\limits_{i=1}^n(x_i-\mu)^2$
Combining the two gives $(n-1)\sigma^2 > n\sigma^2$, i.e. $\sigma^2 < 0$, which is impossible. (And if $\sigma = 0$, every point equals the mean, so the claim holds trivially.) Hence at least one point satisfies $|x_i-\mu| \le \sigma$, and the technique can never empty the dataset.
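As a quick numerical sanity check of the proof (pure standard library; the helper name and the random seed are arbitrary choices of mine):

```python
import random
import statistics

def some_point_within_one_stdev(data):
    """True if at least one point lies within one sample stdev of the mean."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)  # unbiased (n-1) sample standard deviation
    return any(abs(x - mu) <= sigma for x in data)

# Try many random samples of varying size; by the proof above,
# this should hold for every one of them.
random.seed(42)
samples = [[random.gauss(0, 10) for _ in range(random.randint(2, 30))]
           for _ in range(1000)]
print(all(some_point_within_one_stdev(s) for s in samples))
```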