This question arose from another question of mine, in which I was looking for the value of $M$ for which the following equation holds:
$$ f^{-1}\left(1-\frac{d\cdot b}{M\cdot a}\right)\cdot b\cdot e+\sqrt{a\cdot c\cdot b} = a\cdot M $$
or, moving everything to one side:
$$ a\cdot M- f^{-1}\left(1-\frac{d\cdot b}{M\cdot a}\right)\cdot b\cdot e-\sqrt{a\cdot c\cdot b} = 0$$
It is important to know that $a, b, c, d$, and $e$ are all constants, and that $f^{-1}(x)$ is the inverse cumulative distribution function (inverse CDF) of the normal distribution, where the CDF is of course:
$$f(x)=\int\limits_{-\infty}^{x}\frac{e^{-t^2/2}}{\sqrt{2\pi}}\;\mathrm{d}t$$
where the argument is the standardized value $\frac{x-\mu}{\sigma}$.
As you can see, my variable $M$ appears both inside and outside the inverse CDF, and because the error function is involved, there is no closed-form solution, so a numerical one is needed.
In the comments of this question, someone gave a very nice iteration formula:
$$M_{n+1} = \frac{1}{a} \cdot \frac{db}{1-f(\frac{aM_n-\sqrt{abc}}{be})}$$
which worked very well and converged fast enough! But when I asked how he/she arrived at it, I didn't get a response. So I have two questions:

How did he/she get from my equation to this iteration formula?

Is there a general way to construct an iterative formula that converges to the solution of this kind of equation?
We can rearrange your equation to be expressed in terms of $f$ (instead of $f^{-1}$) by moving terms and applying $f$ to both sides: $$1-\frac{db}{aM} = f\bigg(\frac{aM-\sqrt{abc}}{be}\bigg).$$ Notice you can rearrange this as $$M=g(M)\equiv\frac{db}{a\bigg(1-f\big(\frac{aM-\sqrt{abc}}{be}\big)\bigg)}.$$ The commenter on your post used the fixed-point iteration method to find the fixed point $M=g(M)$ of $g$. Fixed-point iteration is simple: $M_{n+1}=g(M_n)$ is all it takes. (It is only guaranteed to converge when $|g'(M)|<1$ near the fixed point, which evidently holds for your constants.)
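To make this concrete, here is a minimal sketch of the fixed-point iteration in Python. The constants $a,\dots,e$ and starting point $M_0$ in the usage line are arbitrary illustrative choices, not values from your problem; with other constants the iteration may diverge, since convergence requires $|g'(M)|<1$ near the fixed point.

```python
import math

def Phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fixed_point(a, b, c, d, e, M0, tol=1e-10, max_iter=200):
    """Iterate M_{n+1} = g(M_n) = db / (a * (1 - Phi((a*M_n - sqrt(abc)) / (b*e))))
    until successive iterates agree to within tol."""
    sqrt_abc = math.sqrt(a * b * c)
    M = M0
    for _ in range(max_iter):
        M_new = d * b / (a * (1.0 - Phi((a * M - sqrt_abc) / (b * e))))
        if abs(M_new - M) < tol:
            return M_new
        M = M_new
    return M  # may not have converged; a real implementation would signal this

# Illustrative constants (a = b = c = d = 1, e = 5) chosen so that |g'| < 1:
M = fixed_point(1.0, 1.0, 1.0, 1.0, 5.0, 2.0)
```

Plugging the returned $M$ back into $1-\frac{db}{aM} = f\big(\frac{aM-\sqrt{abc}}{be}\big)$ is a quick sanity check that the iteration actually found the fixed point.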
Another family of numerical methods is called "root-finding", and works as follows. Let $$h(M)=f\bigg(\frac{aM-\sqrt{abc}}{be}\bigg) - 1 + \frac{db}{aM},$$ so that your $M$ is defined as the root of $h$, i.e. the $M$ such that $$h(M)=0.$$ So, we have expressed your problem as "find the root of the function $h$." There are a lot of methods for root-finding, and the most basic and common is Newton's method. Newton's method generates a sequence $M_1,M_2,\dots$ approaching $M$ by the iterations $$M_{n+1} = M_n - \frac{h(M_n)}{h'(M_n)}.$$ Luckily, in this case we know the derivative $h'$ of $h$ in closed form, since $f$ was defined as an integral of the Gaussian density: $$h'(M)=\frac{a}{be}\,\varphi\bigg(\frac{aM-\sqrt{abc}}{be}\bigg)-\frac{db}{aM^2},$$ where $\varphi(t)=e^{-t^2/2}/\sqrt{2\pi}$ is the standard normal density. Since this iteration makes use of derivative information, it typically converges faster (quadratically near the root). Worth a shot if the fixed-point iteration above isn't cutting it.
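A minimal sketch of Newton's method for this particular $h$, again with arbitrary illustrative constants rather than values from your problem:

```python
import math

def newton_solve(a, b, c, d, e, M0, tol=1e-12, max_iter=50):
    """Newton's method on h(M) = Phi(u) - 1 + db/(aM), with u = (a*M - sqrt(abc)) / (b*e)."""
    sqrt_abc = math.sqrt(a * b * c)
    M = M0
    for _ in range(max_iter):
        u = (a * M - sqrt_abc) / (b * e)
        Phi_u = 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))   # standard normal CDF
        phi_u = math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)  # its density
        h = Phi_u - 1.0 + d * b / (a * M)
        dh = phi_u * a / (b * e) - d * b / (a * M * M)
        # Note: Newton's method can fail if dh is near zero or M0 is far from the root.
        step = h / dh
        M -= step
        if abs(step) < tol:
            return M
    return M

# Same illustrative constants as before (a = b = c = d = 1, e = 5):
M = newton_solve(1.0, 1.0, 1.0, 1.0, 5.0, 2.0)
```

In practice you could also hand $h$ and $h'$ to a library routine (e.g. SciPy's `scipy.optimize.newton`) instead of writing the loop yourself.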