Choosing a cost function for optimization


Suppose I have an independent variable $x\in[0, 1]$ and a function $f\colon[0,1]\rightarrow[0,1]$ such that

  • $f(x) = 1 \Leftrightarrow x = 0$,
  • $f(x) = 0 \Leftrightarrow x = 1$, and
  • $f$ is a decreasing function of $x$,

and that both $x$ and $f(x)$ need to be as small as possible in some as-yet ill-defined sense. I could define a cost function, but the most obvious choice, $g(x) := xf(x)$, will not work: it is minimized exactly when one variable is minimized and the other is maximized. On the other hand, a function like $h(x) := x + f(x)$ could work.
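A quick numerical check of this failure mode, using the hypothetical choice $f(x) = 1 - x$ (which satisfies the three conditions above; the function names `f`, `g`, `h` are just for illustration):

```python
# Hypothetical example: f(x) = 1 - x satisfies f(0) = 1, f(1) = 0,
# and is decreasing on [0, 1].
def f(x):
    return 1.0 - x

# The "obvious" cost g(x) = x * f(x): minimized at the endpoints,
# where one of x, f(x) is 0 and the other is 1 -- exactly what we
# want to avoid.
def g(x):
    return x * f(x)

# The additive cost h(x) = x + f(x) (constant for this linear f,
# but a genuine trade-off for nonlinear f).
def h(x):
    return x + f(x)

xs = [i / 100 for i in range(101)]
print(min(xs, key=g))              # an endpoint: 0.0
print(g(0.0), g(1.0), g(0.5))      # 0.0 0.0 0.25
```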

But the bigger problem remains: I only found $h(x)$ by intuition + guess-and-check. How can I methodically come up with a good cost function? For example, suppose I have one or more of the following additional requirements:

  1. the value of $x$ is, in practice, much closer to 1 than to 0;
  2. I want to allow, but "penalize," values of $x$ that are too close to 1; or
  3. I know that minimizing $f(x)$ is roughly three times as important as minimizing $x$.

Is there some theory on this subject that would be accessible to a math graduate student whose studies have touched very little on optimization?

I have an idea.

Let's say that all three requirements apply. Then $-\log(1-x)$ is an increasing function of $x$, and both it and its derivative increase without bound as $x\rightarrow 1^-$, so it is a sensible preliminary cost function for penalizing values of $x$ that are too close to 1. The same can be said for $-\log(1-f(x))$ as a penalty on $f(x)$, so let's set our cost function to be $C(x) = -\log(1-x) - 3\log(1-f(x))$. For the moment, think of $f(x)$ as a second independent variable $y$; then $y\rightarrow y + h$ affects the cost function three times as much as $x\rightarrow x + h$ whenever $y = x$, since $\partial C/\partial y = 3/(1-y)$ while $\partial C/\partial x = 1/(1-x)$. Thus $C(x)$ is a good start, although now we need to remove the assumption of independence.
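A minimal sketch of this cost function in action, using the hypothetical choice $f(x) = (1-x)^2$ (which satisfies $f(0)=1$, $f(1)=0$, decreasing) and a simple grid search; both log terms blow up at the ends of the interval, so the minimum is interior:

```python
import math

# Hypothetical example: f(x) = (1 - x)**2 satisfies f(0) = 1,
# f(1) = 0, and is decreasing on [0, 1].
def f(x):
    return (1.0 - x) ** 2

# Cost from the answer: C(x) = -log(1 - x) - 3*log(1 - f(x)).
# The first term penalizes x near 1; the second (weighted 3x)
# penalizes f(x) near 1, i.e. x near 0.
def C(x):
    return -math.log(1.0 - x) - 3.0 * math.log(1.0 - f(x))

# Grid search over the open interval (0, 1); both endpoints are
# excluded because C diverges there.
xs = [i / 10000 for i in range(1, 10000)]
x_min = min(xs, key=C)
print(round(x_min, 3))  # ~0.622, matching the analytic minimizer 1 - 1/sqrt(7)
```

For this particular $f$, setting $C'(x)=0$ gives $7x^2 - 14x + 6 = 0$, i.e. $x = 1 - 1/\sqrt{7} \approx 0.622$, which the grid search reproduces.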