Why the interval $[x_b, x_U]$ is ruled out when $f(x_a) < f(x_b)$ in dichotomous search


Consider a unimodal function $f$ which is known to have a minimum in the interval $I = [x_L, x_U]$. Unimodal here means $f$ has exactly one minimum inside $I$. Let $x_a$ and $x_b$ be two points inside $I$ with $x_a < x_b$.

If $f(x_a) < f(x_b)$, why is the possibility of a minimum inside the range $[x_b, x_U]$ ruled out in dichotomous search? It seems that a minimum inside $[x_b, x_U]$ would imply loss of unimodality (i.e., two minima in $I$). How can this be proved?

1 Answer
What constitutes a unimodal function is loosely defined, so I assume the following definition:

A function $f(x)$ is a unimodal function if for some value $m$, it is monotonically increasing for $x \leq m$ and monotonically decreasing for $x \geq m$. In that case, the maximum value of $f(x)$ is $f(m)$ and there are no other local maxima.

The definition above describes a function with a single maximum, whereas in our case we consider a function with a single minimum. So we rewrite the definition to account for that:

Definition: A function $f(x)$ is a unimodal function if for some value $m$, it is monotonically decreasing for $x \leq m$ and monotonically increasing for $x \geq m$. In that case, the minimum value of $f(x)$ is $f(m)$ and there are no other local minima.
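This definition can be checked numerically on a grid of sampled values: the values must first monotonically decrease and then monotonically increase. The helper below is my own sketch (the name `is_unimodal_min` and the example functions are assumptions, not from the question):

```python
def is_unimodal_min(values):
    """Check that a list of sampled function values decreases
    monotonically, then increases monotonically (single minimum)."""
    i = 0
    n = len(values)
    # Walk along the monotonically decreasing part.
    while i + 1 < n and values[i + 1] <= values[i]:
        i += 1
    # The remainder must be monotonically increasing.
    while i + 1 < n and values[i + 1] >= values[i]:
        i += 1
    # If we consumed every sample, the shape was decrease-then-increase.
    return i == n - 1

xs = [x / 10 for x in range(-20, 21)]
print(is_unimodal_min([(x - 0.3) ** 2 for x in xs]))   # True: one minimum
print(is_unimodal_min([x ** 4 - x ** 2 for x in xs]))  # False: two minima
```

The second example, $x^4 - x^2$, has minima near $\pm 1/\sqrt{2}$ with a local maximum between them, so it fails the decrease-then-increase test.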

Suppose, for contradiction, that $f$ is unimodal on $I = [x_L, x_U]$ and that the minimizer $m = \arg \min f(x)$ lies in $[x_b, x_U]$. Then $x_a < x_b \leq m$, so by the definition $f$ must be monotonically decreasing for $x \leq m$, which would force $f(x_a) \geq f(x_b)$. But we are given $f(x_a) < f(x_b)$, so $f$ must be increasing somewhere in $[x_a, x_b]$ and therefore cannot be monotonically decreasing for $x \leq m$. This contradicts the definition, so the minimizer cannot lie in $[x_b, x_U]$, and the search may safely continue on $[x_L, x_b]$.
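This elimination rule is exactly what dichotomous search iterates. A minimal sketch (the function name and the parameters `eps` and `tol` are my own choices; `eps` is the probe offset around the midpoint, and `tol` is the stopping width, which must exceed $2\,\varepsilon$ for the loop to terminate):

```python
def dichotomous_search(f, x_L, x_U, eps=1e-4, tol=1e-3):
    """Locate the minimizer of a unimodal f on [x_L, x_U] by
    repeatedly probing two points around the interval midpoint."""
    while x_U - x_L > tol:
        mid = (x_L + x_U) / 2
        x_a, x_b = mid - eps, mid + eps
        if f(x_a) < f(x_b):
            # By the argument above, the minimizer cannot lie in
            # [x_b, x_U], so discard that subinterval.
            x_U = x_b
        else:
            # Symmetric case: the minimizer cannot lie in [x_L, x_a].
            x_L = x_a
    return (x_L + x_U) / 2

# Example: minimum of (x - 2)^2 on [0, 5]; result is close to 2.0.
print(dichotomous_search(lambda x: (x - 2) ** 2, 0.0, 5.0))
```

Each iteration shrinks the interval width $w$ to $w/2 + \varepsilon$, so the width converges toward $2\varepsilon$; that is why the stopping tolerance must be larger than $2\varepsilon$.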