Let $\mathcal{S}$ be a set of real numbers. Let $\mu$, $m$, $\sigma$, and $r$ be mean, median, standard deviation, and range of $\mathcal{S}$ respectively.
Find $\mathcal{S}$ which maximizes $\dfrac{(\mu-m)r}{\sigma}$.
Alright, consider the set $S = \{-y, 0, x\}$ where $x, y > 0$ and $x \gg y$. Notice right away that the median is $m = 0$, so we're left looking at the quantity:
$$\frac{ \mu r}{\sigma}$$
Now notice that for our set $\mu = \frac{1}{3}(x - y)$, so the (population) standard deviation is \begin{equation} \sigma = \sqrt{\frac{x^2 + y^2}{3} - \mu^2} = \sqrt{\frac{2}{9}\left(x^2 + xy + y^2\right)} \end{equation}
Since the range is $r = x + y$, \begin{equation} \mu r = \frac{x - y}{3}(x + y) = \frac{1}{3}(x^2 - y^2) \end{equation} Fixing $y$ and allowing $x$ to vary yields: $$ \sigma = O(x)$$ $$ \mu r = O(x^2)$$
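As a quick sanity check of these closed forms, here is a small Python sketch (assuming the population standard deviation, via `statistics.pstdev`, which is what the formulas above correspond to):

```python
import math
import statistics

def check(x, y):
    """Verify the closed forms for S = {-y, 0, x} and return mu*r/sigma."""
    s = [-y, 0.0, x]
    mu = statistics.mean(s)       # (x - y)/3
    sigma = statistics.pstdev(s)  # population standard deviation
    r = max(s) - min(s)           # range = x + y

    # closed forms derived above
    sigma_formula = math.sqrt(2 * (x**2 + x * y + y**2) / 9)
    mur_formula = (x**2 - y**2) / 3

    assert math.isclose(sigma, sigma_formula)
    assert math.isclose(mu * r, mur_formula)
    return mu * r / sigma

check(4.0, 1.0)  # assertions pass for any x, y > 0
```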
Thus we have for the set $S$, $$\lim_{x \rightarrow \infty} \frac{(\mu - m)r}{\sigma} = \infty$$
Ergo the expression is unbounded and there is no maximizing set $S$.
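To see the divergence concretely, a minimal numerical sketch (fixing $y = 1$ and growing $x$ by factors of 10; the ratio keeps climbing, consistent with $\mu r = O(x^2)$ against $\sigma = O(x)$):

```python
import statistics

def ratio(x, y=1.0):
    """Compute (mu - m) * r / sigma for the set S = {-y, 0, x}."""
    s = [-y, 0.0, x]
    mu = statistics.mean(s)
    m = statistics.median(s)      # = 0 for this set
    r = max(s) - min(s)           # = x + y
    sigma = statistics.pstdev(s)  # population standard deviation
    return (mu - m) * r / sigma

# ratio grows without bound as x increases
vals = [ratio(10.0**k) for k in range(1, 5)]
```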