Assume that for $i = 1,\ldots,N$, $x_i \ge 0$, and define:
$$\mathrm{avg} = \frac{\sum_i x_i}{N}$$
$$\sigma = \sqrt{\frac{\sum_i (x_i-\mathrm{avg})^2}{N}}$$
Is it true that:
$$\max_i x_i \ge \mathrm{avg} + \sigma\text{ ?}$$
REFERENCE
No. In fact, the inequality fails even for the standard deviation $\sigma$ as defined above (the square root of the variance), not just for the larger quantity $\mathrm{avg} + \sigma^2$.
As an example, consider the numbers $\{0, 6, 6\}$:
Mean: $4$
Variance: $8$
Standard deviation: $\sqrt{8} \approx 2.83$
Here $\max_i x_i = 6$, but $\mathrm{avg} + \sigma \approx 4 + 2.83 = 6.83 > 6$, so the inequality fails.
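For anyone who wants to check the arithmetic, a quick sketch of the counterexample in Python (variable names are my own):

```python
import math

xs = [0, 6, 6]
N = len(xs)

avg = sum(xs) / N                              # mean = 4.0
var = sum((x - avg) ** 2 for x in xs) / N      # population variance = 8.0
sigma = math.sqrt(var)                          # standard deviation ≈ 2.828

# max(xs) = 6, while avg + sigma ≈ 6.828, so the claimed bound fails
print(max(xs), avg + sigma, max(xs) >= avg + sigma)
```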