Minimizing the change of a function when changing its parameters according to some function


I have a function $f$ with a single parameter $\theta$ that varies gradually from a minimum $\theta_{\min}$ to a maximum $\theta_{\max}$. I evaluate $f$ at discrete points and measure the change of the function between consecutive points using some metric, say mean squared error.

I want to keep this error as constant as possible; in other words, I want the function to change as smoothly as possible when sliding along $\theta$. For this I need some rule that determines the sampling of $\theta$, adapting the step size so that the error stays constant.

I tried to do this using the derivative of the error, but couldn't make it work.

To summarize, I want to build a list of $\theta$s such that when measuring the change of $f(\theta)$ when going from one $\theta$ to the next, the change is as constant (or smooth) as possible.
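To illustrate the problem: with a uniform grid of $\theta$s, the change of $f$ between consecutive samples can be very uneven. A minimal sketch, using a hypothetical function $f(\theta)=\theta^3$ as an example:

```python
import numpy as np

# Hypothetical example function; any smooth monotone f would do.
def f(theta):
    return theta**3

# Uniform sampling of theta on [0, 1].
thetas = np.linspace(0.0, 1.0, 11)

# Change of f between consecutive samples.
steps = np.diff(f(thetas))

# The ratio of the largest to the smallest step shows how uneven
# the change is: for theta^3 it is well over 100.
print(steps.max() / steps.min())
```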

Best answer

Here's one approach, which assumes $f$ is continuous and monotone on $[\theta_\min,\theta_\max].$ You are given $f,$ $\theta_\min,$ $\theta_\max,$ and $N,$ the number of steps you want (so the list contains $N+1$ values $\theta_0,\dots,\theta_N$).

  1. Determine $\Delta f:=f(\theta_\max)-f(\theta_\min).$ Assume $\Delta f\not=0.$
  2. Compute $\delta y:=\Delta f/N.$ This is the desired (constant) change in $f$ per step.
  3. Let $i=1$ and $\theta_0=\theta_\min.$
  4. While $i<N$:
  5. $\quad$ solve $f(\theta)=f(\theta_\min)+i\cdot\delta y$ for $\theta,$ set $\theta_i$ to the solution (I'd recommend binary search, which works because $f$ is monotone), and increment $i.$
  6. Let $\theta_N=\theta_\max.$

If $f$ is continuous, the intermediate value theorem guarantees each equation in step 5 has a solution, and monotonicity makes it unique; the only error in the resulting list comes from the tolerance of the root-finding step.
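The steps above can be sketched as follows. This is a minimal implementation, assuming $f$ is continuous and monotone on $[\theta_\min,\theta_\max]$ so that binary search can invert it; the function name and tolerance parameter are my own choices, not from the answer.

```python
def equal_change_thetas(f, theta_min, theta_max, N, tol=1e-10):
    """Return theta_0, ..., theta_N such that f changes by a
    (nearly) constant amount delta_y between consecutive entries."""
    f_min = f(theta_min)
    delta_f = f(theta_max) - f_min        # step 1: total change of f
    assert delta_f != 0
    dy = delta_f / N                      # step 2: desired change per step
    increasing = delta_f > 0

    thetas = [theta_min]                  # step 3: theta_0 = theta_min
    for i in range(1, N):                 # step 4: loop over interior points
        target = f_min + i * dy
        # Step 5: binary search for f(theta) = target.
        lo, hi = theta_min, theta_max
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if (f(mid) < target) == increasing:
                lo = mid
            else:
                hi = mid
        thetas.append(0.5 * (lo + hi))
    thetas.append(theta_max)              # step 6: theta_N = theta_max
    return thetas

# Example: for f(theta) = theta^3 on [0, 1] with N = 10,
# f changes by exactly 0.1 per step.
thetas = equal_change_thetas(lambda t: t**3, 0.0, 1.0, 10)
```

Note that binary search only inverts monotone functions; if $f$ is not monotone, the equation in step 5 can have several solutions, and a different root-finding strategy is needed.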