Calculating the gradient without knowing the function


I have to develop an optimizer for a simulation. There are 7 reference values $$ r_1, r_2,\ldots,r_7 $$ (the values which are expected to show up) and 7 corresponding actual values $$ a_1,a_2,\ldots,a_7,$$ the simulation's results. The deviation between reference values and actual values is condensed into a single number, the sum of squared deviations: $$ f(x)=\sum_{i=1}^{7}(r_i - a_i)^2=\sum_{i=1}^{7}e_i^2=\|e\|^2, $$ where $$x_1,x_2,\ldots,x_7 $$ denote the input values.

Given this, I can evaluate $f(x)$ for arbitrarily many input vectors $x$; each result is a scalar. Obviously I want to minimize the deviations. My minimization method makes use of the gradient. How can I compute the partial derivatives that make up the gradient without knowing the function?

I only know the input values and the result; the relationship between the input values and the result is completely unknown!

1 Answer

You can use the forward-difference approximation:

$$\nabla f(\vec x) = \left(\frac{\partial f}{\partial x_1},\cdots, \frac{\partial f}{\partial x_7}\right) \approx \left(\frac{f(\vec x+h\vec e_1)-f(\vec x)}{h},\cdots,\frac{f(\vec x+h\vec e_7)-f(\vec x)}{h}\right)$$

where $\vec e_i$ denotes the $i$-th standard basis vector.

$h$ need not be a vector: it is a scalar step, added to one coordinate at a time. Then you can use the information given by the gradient to select the variable(s) to change. For example, adopting steepest descent, you apply a change proportional to the negative of the (approximate) gradient of the function at the current point (see the Wikipedia entry on gradient descent, for example).
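A minimal sketch of this idea in Python, treating the simulation as a black-box function `f` (the function name, step size, and learning rate below are illustrative assumptions, not part of the original answer):

```python
import numpy as np

def approx_gradient(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of a
    black-box scalar function f at the point x."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    grad = np.empty_like(x)
    for i in range(x.size):
        xh = x.copy()
        xh[i] += h  # perturb one coordinate at a time
        grad[i] = (f(xh) - fx) / h
    return grad

def steepest_descent(f, x0, lr=0.1, h=1e-6, steps=200):
    """Minimize f by repeatedly stepping against the
    approximate gradient (fixed step length lr)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * approx_gradient(f, x, h)
    return x

# Example: a sum-of-squares deviation from known reference values,
# standing in for the (unknown) simulation.
r = np.array([1.0, 2.0, 3.0])
f = lambda x: np.sum((x - r) ** 2)
x_min = steepest_descent(f, np.zeros(3))
```

Note that each gradient estimate costs $n+1$ evaluations of $f$ (here 8 for 7 inputs), and the choice of $h$ trades truncation error against round-off; a central difference $\frac{f(x+he_i)-f(x-he_i)}{2h}$ is more accurate at twice the cost.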