Numerical optimization of a generic function


I am new to numerical optimization and interested in solving the following problem. Consider a two-parameter function like

$$ g(\omega_1, \omega_2) $$

which I can evaluate numerically, but whose analytical dependence on the parameters $\omega_1, \omega_2$ I do not know.

Is there a generic method to find the global minimum of $g$ using only this information?

Accepted answer:

I am assuming you would like to find a minimum or a maximum of $g$ over some subset of $\Omega_1 \times \Omega_2$, possibly all of $\mathbb{R}^2$.

Assuming $g$ is nice (e.g., continuous), you can use conjugate direction methods such as Powell's method, which need only function evaluations. If $g$ is differentiable, gradient-based methods are usually more efficient, and you can approximate the gradient numerically by finite differences. Note that all of these are local methods: to have any hope of finding the global minimum, you typically need to restart them from several initial points or use a dedicated global optimizer.
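As a minimal sketch of the derivative-free approach, here is Powell's method via SciPy on a hypothetical stand-in for your black-box $g$ (the quadratic below is just an illustration; you would substitute your own evaluation routine):

```python
from scipy.optimize import minimize

def g(w):
    # Hypothetical black-box function; in practice this would call your
    # numerical evaluation of g(w1, w2). True minimum at (1, -2).
    w1, w2 = w
    return (w1 - 1.0) ** 2 + (w2 + 2.0) ** 2

# Powell's method uses only function values, no gradients.
result = minimize(g, x0=[0.0, 0.0], method="Powell")
print(result.x)  # should be close to [1, -2]
```

For a non-convex $g$, you would run this from several random starting points `x0` and keep the best result, since each run only converges to a local minimum.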

Here is a nice discussion of the gradient descent method, with some examples.