How does the expected minimum of a set of random variables decrease as the variance of the random variables increases?


I was reading the paper Addressing Function Approximation Error in Actor-Critic Methods, where the authors explain a benefit of their proposed method as follows:

[Image: excerpt from the paper stating that the expected value of the minimum of a set of estimates decreases as the variance of the estimates increases.]

I wish to build intuition for this effect, either with a proof or an example. Thank you.


There is 1 answer below.


As an example, consider two distributions (RVs) with the same mean. If distribution 1 has a larger variance than distribution 2, its samples are more spread out around that mean. Distribution 1 will therefore tend to have a smaller minimum and a larger maximum than distribution 2, because its values lie further from the mean in both directions. Taking the minimum of several such estimates thus pushes the expectation below the common mean, and the larger the variance, the further below it goes.
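A quick Monte-Carlo sketch of this effect, under the assumption that the estimates are i.i.d. normal (the variable names and the two-estimate setup here are illustrative, not from the paper). For two i.i.d. N(mu, sigma^2) variables, a standard result gives E[min] = mu - sigma/sqrt(pi), so the downward bias grows linearly with the standard deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 1_000_000
mu = 0.0  # common mean of both estimates

def expected_min(sigma, k=2):
    """Monte-Carlo estimate of E[min of k i.i.d. N(mu, sigma^2) draws]."""
    samples = rng.normal(mu, sigma, size=(n_trials, k))
    return samples.min(axis=1).mean()

for sigma in (0.5, 1.0, 2.0):
    mc = expected_min(sigma)
    theory = mu - sigma / np.sqrt(np.pi)  # exact value for k = 2
    print(f"sigma={sigma}: E[min] ~ {mc:.4f}  (theory: {theory:.4f})")
```

Running this shows the expected minimum sitting below the mean of 0 and moving further down as sigma grows, which is exactly the underestimation effect the question asks about.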