Are there any applications for minimizing a neural network function?


I wonder whether there are any theoretical or practical applications (e.g., in papers) of minimizing a neural network function.

Precisely: let $f(w, x)$ be a real-valued neural network function whose weights $w$ have already been determined by training. I am now interested in minimizing $f(w, x)$ over all $x$ in some space. Is there any application for this minimization?

BEST ANSWER

Suppose you want to minimize a function $F(\mathbf{x})$. The function may be very expensive to evaluate (e.g., a computer simulation that takes a couple of hours to complete), and its functional form and/or its gradients may not be available. In that case, you may evaluate the function at $N$ points and collect the pairs $(\mathbf{x}_{1}, y_{1}), \dots, (\mathbf{x}_{N}, y_{N})$, where $y_{i} := F(\mathbf{x}_{i})$. Using the data pairs $(\mathbf{x}_{i}, y_{i})$, $i = 1, \dots, N$, you may train a neural network $f$ with weights $\mathbf{w}$ so that $f(\mathbf{x}_{i} \mid \mathbf{w}) = y_{i} + \epsilon_{i}$. Now, if you minimize $f(\mathbf{x} \mid \mathbf{w})$, you can take $\mathbf{x}^{*} = \arg\min_{\mathbf{x}} f(\mathbf{x} \mid \mathbf{w})$ as your next evaluation point (i.e., the $(N+1)$-th point) for $F$. Then, with the observed $y_{N+1}$, you may re-train the neural network and repeat the process.

The idea is that the neural network $f$ serves as a surrogate for $F$, and assists you in finding $\min_{\mathbf{x}} F(\mathbf{x})$. This approach is known as surrogate optimization. Commonly used surrogate models include RBF interpolants (see, e.g., Regis and Shoemaker, 2007), Gaussian processes (as in Bayesian optimization), decision trees, neural networks, and SVMs.
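
The loop described above can be sketched in a few lines; here is a minimal, self-contained version using a small scikit-learn MLP as the surrogate and `scipy.optimize.minimize` to minimize it. The toy objective `F`, the network size, and the iteration counts are my own illustrative assumptions, not part of the answer:

```python
# Surrogate-optimization sketch: fit a NN surrogate to (x_i, y_i),
# minimize the surrogate, evaluate F at the minimizer, retrain, repeat.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize


def F(x):
    """Toy stand-in for an expensive black-box objective."""
    return float(np.sum((x - 0.3) ** 2))


rng = np.random.default_rng(0)
dim = 2
X = rng.uniform(-1.0, 1.0, size=(20, dim))   # initial design points x_1..x_N
y = np.array([F(x) for x in X])              # y_i = F(x_i)

for _ in range(5):                           # a few surrogate iterations
    # Train the surrogate f(x | w) on all data collected so far.
    nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                      random_state=0).fit(X, y)
    # Minimize the surrogate, starting from the best point seen so far.
    x0 = X[np.argmin(y)]
    res = minimize(lambda x: nn.predict(x.reshape(1, -1))[0], x0,
                   bounds=[(-1.0, 1.0)] * dim)
    x_new = res.x                            # x* = argmin_x f(x | w)
    X = np.vstack([X, x_new])                # evaluate F at x*, augment the
    y = np.append(y, F(x_new))               # data, and retrain next round

print("best x found:", X[np.argmin(y)], "with F =", y.min())
```

In practice the surrogate minimization is often restarted from several initial points (the surrogate may be multimodal), and some exploration is mixed in so the loop does not only exploit the current surrogate minimum; Bayesian optimization formalizes that trade-off via an acquisition function.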