I would like to solve an equation of the form $a^x = bx + c$ for $x$, but I read it cannot be solved algebraically (it is a transcendental equation).
My attempts basically fall flat at $\log_a(a^x - bx) = \log_a(c)$ as there is no way to simplify $\log_a(a^x - bx)$.
Although I did read that $$\log(s + t) = \log(s) + \log\left(1 + \frac{t}{s}\right) \approx \log(\max(s, t))$$
So with $s = a^x$ and $t = -bx$ I can simplify $x \approx \log_a(a^x - bx) = \log_a(c)$, provided $a^x \gg |bx|$.
I tested this numerically and the results seem plausible, but I am unsure how to check whether $a^x$ is really large enough compared to $|bx|$ for the approximation to be accurate. How would I go about calculating the error to keep it below, say, 1%?
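For reference, here is a minimal sketch of the kind of numerical check described above (the function names and the bisection bracket are my own choices; bisection is used only to get an "exact" root to compare against):

```python
import math

def f(x, a, b, c):
    """f(x) = a^x - bx - c; a root of f is a solution of a^x = bx + c."""
    return a**x - b*x - c

def true_root(a, b, c, lo, hi, tol=1e-12):
    """Bisection on a bracket [lo, hi] where f changes sign."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo, a, b, c) * f(mid, a, b, c) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Example: a = 2, b = 3, c = 100; a^x dominates bx near the root
a, b, c = 2, 3, 100
x_approx = math.log(c, a)              # the approximation x ~ log_a(c)
x_exact = true_root(a, b, c, 0, 20)    # f(0) < 0, f(20) > 0 brackets a root
rel_err = abs(x_approx - x_exact) / abs(x_exact)
print(x_approx, x_exact, rel_err)
```

In this example the relative error comes out to a few percent, so the condition $a^x \gg |bx|$ is not yet well satisfied; larger $c$ (relative to $b$) shrinks the error.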
We can use Newton's method to compute a root of $f(x) = a^x - bx - c$. Newton's method is the following iteration:
$$ x_{n + 1} = x_n - \frac{f(x_n)}{f'(x_n)} $$
where an initial estimate $x_0$ of the root is repeatedly improved. Substituting our $f(x)$ and its derivative $f'(x) = a^x \ln a - b$:
$$ x_{n + 1} = x_n - \frac{a^{x_n} - bx_n - c}{a^{x_n} \ln a - b} $$
We repeatedly apply the above to get better and better estimates for the root, stopping once the step size falls below the desired tolerance. The approximation $x_0 \approx \log_a(c)$ derived in the question is a natural choice of initial estimate when $a^x$ dominates near the root.
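The iteration above can be sketched in a few lines of Python (the function name, tolerance, and example values are my own; note $f$ may have two roots, and which one Newton's method finds depends on $x_0$):

```python
import math

def newton_solve(a, b, c, x0, tol=1e-12, max_iter=100):
    """Newton's method for f(x) = a^x - bx - c.

    Iterates x <- x - f(x)/f'(x) with f'(x) = a^x ln(a) - b,
    starting from the initial estimate x0.
    """
    x = x0
    for _ in range(max_iter):
        fx = a**x - b*x - c
        fpx = a**x * math.log(a) - b
        step = fx / fpx
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# Example: solve 2^x = 3x + 5, starting from x0 = log_2(5)
a, b, c = 2, 3, 5
root = newton_solve(a, b, c, x0=math.log(c, a))
print(root, a**root - b*root - c)  # residual should be ~0
```

Each iteration roughly doubles the number of correct digits once the estimate is close, so only a handful of steps are typically needed.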