Shortest Confidence Interval of Unimodal Distribution


Currently I am taking a statistics course. The lecturer stated the following proposition about the shortest confidence interval for a unimodal distribution:

Assume the density $f$ is unimodal. If $\int_{a^*}^{b^*} f(y)\, dy = \alpha_0$ and $f(a^*) = f(b^*)$, then for every interval $[a, b]$ with $\int_{a}^{b} f(y)\, dy = \alpha_0$ we have $b^* - a^* \leq b - a$.

The lecturer says the proof is very easy, but I don't have any idea how to do it.

I find it intuitive in the symmetric unimodal case, but I don't know how to generalize it to all unimodal distributions.

Could anyone help me? Thanks in advance!

P.S. I don't have a background in measure theory.
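(Editor's note: a small numerical illustration of the proposition, not part of the original question, using the hypothetical asymmetric unimodal Gamma(2,1) density $f(y) = y e^{-y}$ with CDF $F(y) = 1 - (1+y)e^{-y}$. For $\alpha_0 = 0.5$ the equal-tail interval is strictly longer than the interval whose endpoints have equal density.)

```python
import math

# Gamma(2,1): an asymmetric unimodal density, mode at y = 1.
def f(y):
    return y * math.exp(-y) if y > 0 else 0.0

def F(y):
    return 1.0 - (1.0 + y) * math.exp(-y) if y > 0 else 0.0

def solve_F(target, lo=0.0, hi=50.0):
    # Invert the CDF by bisection.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if F(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

alpha0 = 0.5

# Equal-tail interval: cut 25% of the mass off each side.
a_et, b_et = solve_F(0.25), solve_F(0.75)

# Shortest interval: scan left endpoints a, pair each with the b
# capturing mass alpha0, and keep the shortest pair.
len_short, a_star = min(
    (solve_F(F(a) + alpha0) - a, a) for a in [i * 1e-3 for i in range(1, 1600)]
)
b_star = solve_F(F(a_star) + alpha0)

print(len_short, b_et - a_et)  # shortest interval is strictly shorter
print(f(a_star), f(b_star))    # densities at its endpoints agree
```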

Best answer:

Suppose the pair $(a, b)$, with $b > a$, satisfies the equation

$$ F(b) - F(a) = \alpha_0 \tag{*} $$

where $F$ is the CDF. Note that $a$ and $b$ are implicitly related through this constraint, so $b$ can be viewed as a function of $a$. We want to find which pair $(a, b)$ minimizes the length $b - a$.

Differentiating both sides with respect to $a$ gives

$$ f(b) \frac {\partial b} {\partial a} - f(a) = 0 \Rightarrow \frac {\partial b} {\partial a} = \frac {f(a)} {f(b)}$$

Therefore,

$$ \frac {\partial} {\partial a} (b - a) = \frac {f(a)} {f(b)} - 1 \begin{cases} < 0 & \text{when}& f(a) < f(b) \\ = 0 & \text{when}& f(a) = f(b) \\ > 0 & \text{when}& f(a) > f(b) \\ \end{cases} \tag{**}$$
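(Editor's note: the derivative identity $\partial b/\partial a = f(a)/f(b)$ can be checked numerically. The sketch below is my own, using the hypothetical Gamma(2,1) example $f(y) = y e^{-y}$, $F(y) = 1 - (1+y)e^{-y}$: it compares a central finite difference of $b(a)$ against $f(a)/f(b)$.)

```python
import math

def f(y):
    return y * math.exp(-y) if y > 0 else 0.0

def F(y):
    return 1.0 - (1.0 + y) * math.exp(-y) if y > 0 else 0.0

def b_of_a(a, alpha0=0.5):
    """Solve F(b) - F(a) = alpha0 for b > a by bisection."""
    lo, hi = a, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if F(mid) - F(a) < alpha0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

a, h = 0.7, 1e-5
fd = (b_of_a(a + h) - b_of_a(a - h)) / (2 * h)  # finite-difference db/da
print(fd, f(a) / f(b_of_a(a)))                  # the two values agree
```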

Let $m$ denote the mode, and assume $f$ is strictly unimodal, i.e. $f(y)$ is strictly increasing for $y < m$ and strictly decreasing for $y > m$. That is, when $a < b < m$ we have $f(a) < f(b) < f(m)$, and when $m < a < b$ we have $f(m) > f(a) > f(b)$. So if $f(a^*) = f(b^*)$, we must have $a^* < m < b^*$.

On the other hand, since $(a^*, b^*)$ also satisfies $(*)$, for any pair $(a, b)$ satisfying $(*)$ we have

$$ F(b) - F(a) = \alpha_0 = F(b^*) - F(a^*) \Rightarrow F(b^*) - F(b) = F(a^*) - F(a) $$

Therefore $F(b^*) - F(b)$ and $F(a^*) - F(a)$ must have the same sign. As the CDF $F$ is strictly increasing on the support of $f$, either $\{a < a^* \text{ and } b < b^*\}$, or $\{a > a^* \text{ and } b > b^*\}$, or $(a, b) = (a^*, b^*)$.

So if $a < a^*$, then $b < b^*$, i.e. either $b < m$ or $m \leq b < b^*$. If $m \leq b < b^*$, then $$f(b) > f(b^*) = f(a^*) > f(a),$$ where the inequalities hold because $f$ is strictly decreasing to the right of $m$ and strictly increasing to the left of it (note $a < a^* < m$). And if $a < b < m$, then $f(a) < f(b)$ since $f$ is increasing there. So in either case $f(b) > f(a)$ when $a < a^*$. Similarly, $f(a) > f(b)$ when $a > a^*$. Hence $(**)$ can be restated as

$$ \frac {\partial} {\partial a} (b - a) = \frac {f(a)} {f(b)} - 1 \begin{cases} < 0 & \text{when}& a < a^* \\ = 0 & \text{when}& a = a^* \\ > 0 & \text{when}& a > a^* \\ \end{cases}$$

which shows that $(a^*, b^*)$ is the unique minimizer of the length $b - a$.
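(Editor's note: as a sanity check of this conclusion, again on the hypothetical Gamma(2,1) example $f(y) = y e^{-y}$, the sketch below confirms numerically that the interval length $b(a) - a$ decreases for $a < a^*$ and increases for $a > a^*$, and that the minimizing left endpoint lies below the mode $m = 1$.)

```python
import math

def f(y):
    return y * math.exp(-y) if y > 0 else 0.0

def F(y):
    return 1.0 - (1.0 + y) * math.exp(-y) if y > 0 else 0.0

def length(a, alpha0=0.5):
    """Return b(a) - a, where b solves F(b) - F(a) = alpha0 (bisection)."""
    lo, hi = a, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if F(mid) - F(a) < alpha0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi) - a

# Locate a* by grid search, then check the sign pattern of (**).
grid = [i * 1e-3 for i in range(1, 1600)]
a_star = min(grid, key=length)
print(length(a_star - 0.2) > length(a_star))  # length decreasing left of a*
print(length(a_star + 0.2) > length(a_star))  # length increasing right of a*
print(0 < a_star < 1)                         # a* lies below the mode m = 1
```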