Find the roots of a function with logarithms (possibly using the Lambert W function)


I am wondering if anyone can help me find an analytical solution to the roots of the following function: $$f(b) = c\log \left( \frac{b}{a} \right) + (n-c)\log \left( \frac{1-b}{1-a} \right),$$ $a,b \in (0,1)$ and $n,c \in \mathbb{Z}$ with $c \le n$.

I have done some googling and found references to using the Lambert W function for finding roots in situations where logarithms are involved; unfortunately, nothing I've read seems like it will solve my problem. I am admittedly completely new to the Lambert W function, so it's very possible that I am missing something.

I have proven that $f(b)$ is concave and has exactly two roots (I also have graphical support for this claim). Clearly one root is $b=a$; the other root is my challenge. The context of the actual problem--which I can expound upon if it would be helpful--leads me to believe that this non-trivial root $r$ should be a function of $t = \frac{c}{n}$, i.e., $r = h(t)$.
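For concreteness, the non-trivial root can at least be located numerically. The sketch below uses the sample values $a=0.1$, $n=10$, $c=4$ that come up later in the thread; `bisect` is a hand-rolled illustrative routine, not a library call. Since $f$ is concave with maximum at $b_*=c/n$ and $f\to-\infty$ as $b\to 1$, the non-trivial root is bracketed by $(b_*, 1)$:

```python
import math

def f(b, a=0.1, n=10, c=4):
    """The concave function from the question, at the sample parameter values."""
    return c * math.log(b / a) + (n - c) * math.log((1 - b) / (1 - a))

def bisect(g, lo, hi, tol=1e-12):
    """Plain bisection; assumes g(lo) and g(hi) have opposite signs."""
    glo = g(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        gmid = g(mid)
        if hi - lo < tol:
            return mid
        if (glo > 0) == (gmid > 0):
            lo, glo = mid, gmid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# f is positive at the maximizer b* = c/n = 0.4 and -inf as b -> 1,
# so the non-trivial root lies in (0.4, 1).
root = bisect(f, 0.4, 1 - 1e-9)
print(root)  # ≈ 0.768991
```

This gives a ground-truth value to compare any analytical approximation of $h$ against.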

I am open to any ideas or suggestions, even if they do not exactly solve the problem, because they may provide analytical approximations or bounds on $h$ and/or $h^{-1}$, which is my ultimate goal.

Update

@Claude's suggestion to Taylor expand moves us forward because we can get a degree-2 polynomial approximation: $$f(b) = f(b_{*}) + \frac{f''(b_{*})(b-b_{*})^2}{2}+R_2(b),$$ where $R_2(b)$ is the remainder/error (the first-order term vanishes because $b_*$ is the maximizer). We can then use the quadratic formula to find the roots of this degree-2 polynomial, but can we get a bound on $R_2(b)$?
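As a sketch of this idea (again assuming the sample values $a=0.1$, $n=10$, $c=4$): since $f'(b_*)=0$, the degree-2 approximation vanishes at $b_* \pm \sqrt{-2f(b_*)/f''(b_*)}$, and the larger root approximates $b_{\max}$:

```python
import math

def approx_root(a, n, c):
    """Larger root of the second-order Taylor expansion of f about b* = c/n.
    The expansion is f(b*) + f''(b*)(b - b*)^2 / 2, which vanishes at
    b* +/- sqrt(-2 f(b*) / f''(b*))."""
    b_star = c / n
    f_star = c * math.log(b_star / a) + (n - c) * math.log((1 - b_star) / (1 - a))
    f2_star = -n**3 / (c * (n - c))  # closed form for f''(b*)
    return b_star + math.sqrt(-2 * f_star / f2_star)

print(approx_root(0.1, 10, 4))  # ≈ 0.786516, vs. the true root ≈ 0.768991
```

The gap between $0.786516$ and $0.768991$ is exactly the effect of the unbounded-so-far remainder $R_2$.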

Why Bound The Remainder (Context of My Problem)

So $f(b)$ is a concave function, with 2 roots: trivially when $b = a$ and another non-trivial root that I will call $b_{\max}$. The function $f$ is maximized at $b_{*} = \frac{c}{n}$, i.e., $b_{*} = \arg\max_b f(b)$. I originally wanted to find $b_{\max}$ because I believe there is some other function $h(b_{*}) = b_{\max}$. I believe from this function I should be able to derive an inequality in the vein of $b_{*} \le \frac{b_{\max}}{q}$, where $q$ is a real number or possibly a function of other variables. Intuitively, $b_{\max}$ cannot be "too close" to $b_{*}$.

By doing the Taylor expansion approximation as suggested, we will get an approximation of $f$ (called $f_{\text{aprx}}$). From $f_{\text{aprx}}$, we can get an approximation for $b_{\max}$ (called $b_{\text{aprx}}$). However, we need to bound how far $b_{\text{aprx}}$ is from $b_{\max}$. This is because once we compute our approximation of $h$ (called $h_{\text{aprx}}$)--by deriving the relationship between $b_{*}$ and $b_{\text{aprx}}$ in $f_{\text{aprx}}$--we then compute $b_{*} \le \frac{b_{\text{aprx}}}{q_\text{aprx}}$. However, I need a bound on how far $q_{\text{aprx}}$ is from $q$, which means I need a bound on how far $b_{\text{aprx}}$ is from $b_{\max}$.

Answers

I am not sure that you can obtain an explicit solution, and I am afraid that numerical methods will be required.

We have $$f(b) = c\log \left( \frac{b}{a} \right) + (n-c)\log \left( \frac{1-b}{1-a} \right)$$ $$f'(b)=\frac{c-n b}{b(1-b)}$$ $$f''(b)=-\frac{nb^2 -2 cb +c}{b^2(1-b)^2}$$ Since $c<n$, the numerator $nb^2-2cb+c$ has negative discriminant $4c(c-n)<0$, so $f''(b)$ is always negative (as you proved). The first derivative vanishes at $b_*=\frac cn$.
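These derivative formulas are easy to sanity-check with central finite differences; the snippet below does so at the sample values $a=0.1$, $n=10$, $c=4$ (`f1` and `f2` are my own names for the analytic derivatives):

```python
import math

a, n, c = 0.1, 10, 4  # sample values from later in this answer

def f(b):
    return c * math.log(b / a) + (n - c) * math.log((1 - b) / (1 - a))

def f1(b):  # analytic first derivative: (c - nb) / (b(1-b))
    return (c - n * b) / (b * (1 - b))

def f2(b):  # analytic second derivative: -(nb^2 - 2cb + c) / (b^2 (1-b)^2)
    return -(n * b**2 - 2 * c * b + c) / (b**2 * (1 - b)**2)

h = 1e-5
for b in (0.2, 0.4, 0.7):
    num1 = (f(b + h) - f(b - h)) / (2 * h)          # central difference
    num2 = (f(b + h) - 2 * f(b) + f(b - h)) / h**2  # second difference
    assert abs(num1 - f1(b)) < 1e-4
    assert abs(num2 - f2(b)) < 1e-3
    assert f2(b) < 0  # concavity everywhere on the grid

print("derivative checks passed")
```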

What I would do is expand $f(b)$ around $b_*$ and make the approximation $$0=\left(c \log \left(\frac{c}{a n}\right)+(n-c) \log \left(\frac{n-c}{(1-a) n}\right)\right)+\frac{n^3 }{2 c (c-n)}\left(b-\frac{c}{n}\right)^2+O\left(\left(b-\frac{c}{n}\right)^3\right)$$ Solve for $b$ (retaining the largest root). From this point, start Newton's method.

As you suggested, the equation simplifies if you set $n=tc$ from the beginning (note that this $t$ is the reciprocal of your $t=\frac cn$), and the approximate solution then depends only on $a$ and $t$.

Applied to the case you gave ($a=0.1$, $n=10$, $c=4$), the approximation gives an estimate $\approx 0.786516$ for a solution which is $\approx 0.768991$.
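The whole procedure of this answer (Taylor estimate, then Newton polishing) can be sketched in a few lines of Python for the sample case; the variable names are mine:

```python
import math

a, n, c = 0.1, 10, 4  # the case given in the question

def f(b):
    return c * math.log(b / a) + (n - c) * math.log((1 - b) / (1 - a))

def fprime(b):
    return (c - n * b) / (b * (1 - b))

# Step 1: largest root of the second-order Taylor expansion about b* = c/n.
b_star = c / n
f2_star = -n**3 / (c * (n - c))         # f''(b*) in closed form
b = b_star + math.sqrt(-2 * f(b_star) / f2_star)
print(b)  # ≈ 0.786516

# Step 2: polish with a few Newton iterations.
for _ in range(5):
    b -= f(b) / fprime(b)
print(b)  # ≈ 0.768991
```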


I prefer to add another answer to this problem rather than to edit the previous one.

Setting $n=tc$ in the previous equation and dropping the overall factor $c$ (which does not affect the roots), we then have $$f(b)=\log \left(\frac{b}{a}\right)+(t-1) \log \left(\frac{1-b}{1-a}\right)$$ $$f'(b)=\frac{1-t b}{b(1-b)}$$ $$f''(b)=-\frac{tb^2 -2 b+1}{b^2(1-b)^2}$$ The maximum of the function occurs at $b=\frac 1t$ and the second derivative is always negative since $t>1$.

Now, let us define a second point $b_*$ such that $$f'(b_*)=-f'(a)$$ Then $b_*$ is a solution of the quadratic $$ (a t-1)b^2+ \left(a^2 t-2 a t+1\right)b+a(1-a) =0$$ and the root $b_*$ must be selected such that $b_*>\frac 1t$.

Now, apply one iteration of Newton's, Halley's, or Householder's method starting at $b_0=b_*$. This should give a good estimate of the solution; if, by chance, $f(b_*)<0$, the first iteration of Newton's method provides an upper bound on the solution (Darboux's theorem).

Using the values you gave ($a=0.1$, $t=2.5$), the first iterates of the three methods mentioned above are $0.784510$, $0.772058$ and $0.769675$, while, as said earlier, the solution is $\approx 0.768991$.
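The Newton variant of this recipe is short enough to sketch in Python (the Halley and Householder variants are omitted here); the starting point $b_0$ comes from the quadratic above, and the variable names are mine:

```python
import math

a, t = 0.1, 2.5  # sample values (a = 0.1, n = 10, c = 4 gives t = n/c = 2.5)

def f(b):
    return math.log(b / a) + (t - 1) * math.log((1 - b) / (1 - a))

def fprime(b):
    return (1 - t * b) / (b * (1 - b))

# Starting point: root of (at-1) b^2 + (a^2 t - 2at + 1) b + a(1-a) = 0
# with b0 > 1/t, i.e. the point where f'(b0) = -f'(a).
A, B, C = a * t - 1, a * a * t - 2 * a * t + 1, a * (1 - a)
disc = math.sqrt(B * B - 4 * A * C)
b0 = max((-B + disc) / (2 * A), (-B - disc) / (2 * A))
print(b0)  # ≈ 0.842443

# First Newton iterate, already close to the root ≈ 0.768991.
b1 = b0 - f(b0) / fprime(b0)
print(b1)  # ≈ 0.784510
```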