I have two functions $f, g: \mathbb{R} \rightarrow \mathbb{R}$ and I know their definitions (i.e. I can differentiate them, integrate them, and so on, provided it is possible). Now define a "wrapper" around an $\mathbb{R} \rightarrow \mathbb{R}$ function like this: $w_h(x) = h(a_hx + b_h)$, where $a_h, b_h \in \mathbb{R}$ are parameters.
Now, I have a function $w_f$ (i.e. a wrapper around $f$) and I know the parameters $a_f$ and $b_f$. I would like to find parameters $a_g$ and $b_g$ for $w_g$ so that $w_g$ is as close as possible to $w_f$. Is there a general procedure that accomplishes this?
Simple, non-general example
$f(x) = 2x$, thus $w_f(x) = 2(a_fx + b_f)$. Let $g(x) = x$. Then the optimal parameters are $a_g = 2a_f$ and $b_g = 2b_f$, since this gives $w_g(x) = g(a_gx + b_g) = a_gx + b_g = 2a_fx + 2b_f = 2(a_fx + b_f) = w_f(x)$.
Least squares fitting should work.
Take some number of points $x_1, \ldots, x_n$. You'll have to choose these depending on the complexity of your functions. Then the error between $w_f$ and $w_g$ can be measured by $$ e(a_g,b_g) = \sum_{i=1}^n \big(w_f(x_i) - w_g(x_i)\big)^2 $$ Find the partial derivatives with respect to $a_g$ and $b_g$, and set them to zero to find a minimum.
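To sketch that step: assuming $g$ is differentiable, the chain rule gives the stationarity conditions

$$\frac{\partial e}{\partial a_g} = -2\sum_{i=1}^n \big(w_f(x_i) - w_g(x_i)\big)\, g'(a_g x_i + b_g)\, x_i = 0,$$
$$\frac{\partial e}{\partial b_g} = -2\sum_{i=1}^n \big(w_f(x_i) - w_g(x_i)\big)\, g'(a_g x_i + b_g) = 0.$$

Note these are generally nonlinear in $a_g$ and $b_g$ (only linear $g$ gives the classic linear normal equations), so in practice you would solve them numerically.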
Alternatively, use a standard optimization package to minimise $e(a_g,b_g)$.
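For instance, here is a minimal sketch in Python using `scipy.optimize.minimize`, with the question's example ($f(x) = 2x$, $g(x) = x$) standing in for the real functions and an arbitrary choice of $a_f, b_f$:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical choices for illustration, matching the question's example.
f = lambda x: 2 * x
g = lambda x: x

a_f, b_f = 3.0, 1.5            # known wrapper parameters for f (arbitrary)
xs = np.linspace(-5, 5, 50)    # sample points; pick them to cover your region of interest

wf = f(a_f * xs + b_f)         # target values w_f(x_i)

def e(params):
    """Sum of squared errors between w_f and w_g at the sample points."""
    a_g, b_g = params
    return np.sum((wf - g(a_g * xs + b_g)) ** 2)

res = minimize(e, x0=[1.0, 0.0])
a_g, b_g = res.x
```

For this example the minimizer should recover $a_g \approx 2a_f = 6$ and $b_g \approx 2b_f = 3$, in line with the closed-form answer above.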