Optimization to find the change in input parameters that best explains a change in output parameters

I have trouble formulating my problem clearly. I want to find the change in a simulated input parameter vector $\vec{p}_{simulation}$ that best explains a given change in a measured output parameter vector $\vec{y}_{measurement}$. The function linking these two vectors is a numerical model, which I assume captures the relevant effects well enough. I normalize the $\vec{y}$ vectors to make them comparable. This is how I would write the optimization goal; perhaps someone could help me with the notation:

$\Delta\vec{p}_{simulation} = \underset{\Delta\vec{p}}{\arg\min}\left(\|\Delta\vec{y}_{measurement}\|_{max}-\|\Delta\vec{y}_{simulation}\|_{max}\right)$

Does this kind of problem have a name?

Best answer:

So you have a numerical model $\hat{y}(p)$ with adjustable parameters $p$, and you want to find the vector of parameter values that brings the model output as close as possible to the measured value $y$.

This is a very common problem in statistics and machine learning, where you set a model's parameters based on how close its output gets to a target value. It is usually called parameter estimation or model fitting.

In your case, you are trying to get the output $\hat{y}$ "close" to $y$ by adjusting $p$. What "close" means depends on the loss function you choose, for example:

  1. Absolute difference: $|\hat{y}(p)-y|$
  2. Squared difference: $(\hat{y}(p)-y)^2$

In fact, any symmetric, non-negative function $f(d)$ of the difference $d$ can serve as your definition of "close" that you then minimize.

Your formulation is closest to (1), though you'd need to include the absolute value so the objective doesn't run off to $-\infty$.
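As a toy illustration of fitting by minimizing either loss (everything here is hypothetical: the linear model `y_hat`, the measured value, and the grid of candidates are stand-ins for your numerical model and data), a crude grid search already works:

```python
# Toy sketch: fit a scalar parameter p by minimizing a loss between
# model output and a measurement. The model y_hat(p) = 2*p and the
# measured value y = 5.0 are made-up stand-ins for the real setup.
def y_hat(p):
    return 2.0 * p

y = 5.0  # hypothetical measurement

def loss_abs(p):   # definition (1): absolute difference
    return abs(y_hat(p) - y)

def loss_sq(p):    # definition (2): squared difference
    return (y_hat(p) - y) ** 2

# Crude grid search over candidate parameter values in [0, 5].
candidates = [i / 100.0 for i in range(501)]
best_p = min(candidates, key=loss_abs)
print(best_p)  # both losses are minimized at p = 2.5
```

For a real (vector-valued, expensive) model you would replace the grid search with a numerical optimizer, but the structure of the problem is the same: a loss built from the model/measurement difference, minimized over the parameters.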

How easy or hard this is depends on whether your objective is convex (i.e., does its graph resemble a parabola?), so that it has only one "hill" to climb (or rather, here, one bowl to descend into). If so, a local numerical optimizer will suffice.

If there are many hills/bowls, then you'll have to try many starting points and hope that one of them converges to the global optimum.
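A minimal multi-start sketch of that last idea (the non-convex objective `f`, with two bowls, is invented purely for illustration; in practice `f` would be your loss built from the numerical model):

```python
import random

# Multi-start local optimization on a non-convex 1-D objective.
# f has two local minima (near p = +1 and p = -1); a single
# gradient-descent run can get stuck in the shallower one.
def f(p):
    return (p * p - 1.0) ** 2 + 0.5 * p

def df(p):  # analytic derivative of f
    return 4.0 * p * (p * p - 1.0) + 0.5

def local_descent(p0, lr=0.01, steps=2000):
    """Plain gradient descent from starting point p0."""
    p = p0
    for _ in range(steps):
        p -= lr * df(p)
    return p

random.seed(0)
starts = [random.uniform(-2.0, 2.0) for _ in range(10)]
results = [local_descent(p0) for p0 in starts]
best = min(results, key=f)  # keep the best local minimum found
print(best)
```

With enough random starts, at least one run lands in the basin of the deeper minimum (here near $p \approx -1.06$); the same pattern scales to vector-valued parameters with a proper local optimizer in place of the hand-rolled descent.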