I was trying to understand the Quake III fast inverse square root algorithm, and I want to find the best $u$ value in the $\log_2(x+1)\approx x+u$ approximation


I was trying to find the best $u$ value for this approximation:

$\log_2(x+1)≈x+u$

And I thought I could calculate the error with this function.
NOTE: I only need the $x$ values between 0 and 1, because IEEE 754 floating point stores numbers in a scientific-notation form $(1 + x) \cdot 2^n$ with $0 \le x < 1$.

$f(u)=\int_{0}^{1}|\log_2(x+1)-(x+u)|dx$

Then I wanted to find where the slope becomes zero (which means a local minimum in this situation).

$\frac{d}{du}f(u)=0$
$\frac{d}{du}(\int_{0}^{1}|\log_2(x+1)-(x+u)|dx)=0$

Then I got stuck. What can I even do after that?
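Before doing any calculus, the optimal $u$ can be sanity-checked by brute force: approximate the integral with a midpoint rule and scan candidate $u$ values. This is a hypothetical Python sketch (the grid sizes and the scan range $[0, 0.1]$ are arbitrary choices):

```python
import math

def total_error(u, n=2000):
    # Midpoint-rule approximation of f(u) = integral over [0,1] of
    # |log2(1+x) - (x+u)| dx.
    h = 1.0 / n
    return sum(abs(math.log2(1 + (i + 0.5) * h) - ((i + 0.5) * h + u)) * h
               for i in range(n))

# Scan candidate u values on [0, 0.1] in steps of 0.0005
# and keep the one with the least total error.
best_u = min((k * 0.0005 for k in range(201)), key=total_error)
print(best_u)  # lands near 0.0644
```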


There are 2 best solutions below

BEST ANSWER

First of all, I am answering my own question after 2 days because I found a very cool solution for finding the best $u$ value in this approximation:

$\log_2(1+x)\approx x+u$

My solution starts with expressing the error as an integral:

$f_{error}(x,u)=\log_2(1+x)-(x+u)$
$f_{totalErr}(u)=\int_{0}^{1}|f_{error}(x,u)|\,dx$ (total error between 0 and 1)

Then I split the integral into two parts, because taking the absolute value of the error makes the integrand non-smooth:

$f_{tErrPlus}(u, s, E)=\int_{s}^{E}f_{error}(x,u)\,dx$ {if $f_{error}(x,u) \ge 0$ on $[s,E]$}
$f_{tErrMinus}(u, s, E)=-f_{tErrPlus}(u, s, E)$ {if $f_{error}(x,u) \le 0$ on $[s,E]$}

In this part, we will need to cheat a little and use the graphs. :D

(Desmos graph of the error function omitted)

We see that our error function has two roots, which means we have three parts: two negative and one positive. We can see that

if $(x_0, x_1)$ are roots
$f_{totalErr}(u) = f_{tErrMinus}(u, 0, x_0) + f_{tErrPlus}(u, x_0, x_1) + f_{tErrMinus}(u, x_1, 1)$
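This split can be checked numerically: for a fixed $u$, find the two roots, integrate each signed piece, and compare against the integral of the absolute error. An illustrative Python sketch (the bisection root-finder, the midpoint integrator, and the sample $u = 0.05$ are my own choices, not part of the answer):

```python
import math

def err(x, u):
    return math.log2(1 + x) - (x + u)

def signed_integral(f, a, b, n=2000):
    # Simple midpoint rule on [a, b].
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def bisect(f, a, b, it=60):
    # Plain bisection; assumes f changes sign on [a, b].
    for _ in range(it):
        m = 0.5 * (a + b)
        a, b = (m, b) if f(a) * f(m) > 0 else (a, m)
    return 0.5 * (a + b)

u = 0.05                      # any u with two roots in (0, 1)
xp = 1 / math.log(2) - 1      # where the error peaks, between the roots
x0 = bisect(lambda x: err(x, u), 0.0, xp)
x1 = bisect(lambda x: err(x, u), xp, 1.0)

# f_tErrMinus(0, x0) + f_tErrPlus(x0, x1) + f_tErrMinus(x1, 1)
total = (-signed_integral(lambda x: err(x, u), 0, x0)
         + signed_integral(lambda x: err(x, u), x0, x1)
         - signed_integral(lambda x: err(x, u), x1, 1))
check = signed_integral(lambda x: abs(err(x, u)), 0, 1)
print(abs(total - check) < 1e-6)
```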

I tried to find the roots of our function analytically, but they can only be expressed with the Lambert W function, which I couldn't find how to calculate.
But then I remembered that our function is perfect for Newton's method. To use Newton's method we need the derivative of our error function:

$f_{ddx}(x) = \frac{d}{dx}f_{error}(x, u) = \frac{1}{\ln(2)(x+1)}-1$

We can start our Newton iteration from 0 for $x_0$ and from 1 for $x_1$.

Proof: (You can skip this part)

We know $u > 0$ if we want the best approximation,

and

when $f_{ddx}(x)=0$, the solution always stays between 0 and 1; the $u$ value doesn't change it, because $u$ is a constant and constants vanish under differentiation.

End of Proof

Let's use Newton's Method:

$g_{next}(x, u) = x - \frac{f_{error}(x, u)}{f_{ddx}(x)}$
$x_0 = (g∘g∘g...)(0, u)$
$x_1 = (g∘g∘g...)(1, u)$
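A minimal Python sketch of this iteration (the iteration count of 30 is an arbitrary choice; for the near-optimal $u = 0.0644$ the two roots come out roughly $0.5$ apart):

```python
import math

def f_error(x, u):
    return math.log2(1 + x) - (x + u)

def f_ddx(x):
    # d/dx f_error(x, u); u drops out because it is a constant.
    return 1 / (math.log(2) * (x + 1)) - 1

def g_next(x, u):
    # One Newton step.
    return x - f_error(x, u) / f_ddx(x)

def find_root(x, u, iterations=30):
    # Repeated composition (g o g o ... o g)(x, u).
    for _ in range(iterations):
        x = g_next(x, u)
    return x

u = 0.0644
x0 = find_root(0.0, u)   # left root, near 0.207
x1 = find_root(1.0, u)   # right root, near 0.707
print(x0, x1)
```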

We need the integral $\int_{0}^{E}f_{error}(x,u)\,dx$, which is $f_{tErrPlus}(u,0,E)$.
I will skip how the integral is computed, because I think everybody knows.

$f_{tErrPlus}(u, 0, E) = \frac{(E + 1)\ln(E + 1) - E}{\ln(2)}- \frac{E^2}{2}-uE$
$f_{tErrPlus}(u, s, E) = f_{tErrPlus}(u, 0, E) - f_{tErrPlus}(u, 0, s)$
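The closed form can be cross-checked against a direct Riemann sum of the integrand (a hypothetical sketch; the sample values of $u$, $s$, $E$ are arbitrary):

```python
import math

def f_terr_plus(u, s, E):
    # Closed-form signed integral of f_error over [s, E],
    # via the antiderivative F evaluated at the endpoints.
    def F(e):
        return ((e + 1) * math.log(e + 1) - e) / math.log(2) - e * e / 2 - u * e
    return F(E) - F(s)

u, s, E = 0.0644, 0.2, 0.7
n = 100_000
h = (E - s) / n
# Midpoint-rule sum of log2(1+x) - (x+u) over [s, E].
numeric = sum(math.log2(1 + s + (i + 0.5) * h) - (s + (i + 0.5) * h + u)
              for i in range(n)) * h
diff = abs(f_terr_plus(u, s, E) - numeric)
print(diff < 1e-8)
```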

And after that, it is just calculation:

$\frac{d}{du}(f_{tErrMinus}(u, 0, x_0) + f_{tErrPlus}(u, x_0, x_1) + f_{tErrMinus}(u, x_1, 1)) = 0$
or
$\frac{d}{du}(f_{tErrMinus}(u, 0, 1) + 2\,f_{tErrPlus}(u, x_0, x_1)) = 0$

And when we calculate all of that (I did), it gives us $u \approx 0.0644$ as the best value:

$u=0.0644$
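A sketch of where this number comes from, assuming the stationarity condition reduces to $x_1 - x_0 = \tfrac12$ (the boundary terms of the $u$-derivative vanish because $f_{error}$ is zero at the roots, so the derivative is $1 - 2(x_1 - x_0)$); this closed form is my own completion of the calculation:

```python
import math

# With x1 = x0 + 1/2 and f_error(x0, u) = f_error(x1, u) = 0, subtracting
# the two root equations gives log2((x0 + 3/2) / (x0 + 1)) = 1/2, so
#   x0 = (3/2 - sqrt(2)) / (sqrt(2) - 1)
x0 = (1.5 - math.sqrt(2)) / (math.sqrt(2) - 1)
# u is then read off from f_error(x0, u) = 0:
u = math.log2(1 + x0) - x0
print(round(u, 4))  # -> 0.0644
```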

SECOND ANSWER

$\log_2(x+1)≈x+u$

So this is

$$ f(x)=\log_2(1+x) -x\approx u \tag1 $$ for $x\in[0,1]$. We have $f(0)=f(1)=0$ and $f(x)\geqslant0$ over the interval which means that for the image of $f$ we have: $f([0,1]) = [0,v]$.

The (local) maximum of $f$ is given by $$f'(x_0)=\frac1{(1+x_0)\ln2}-1 \stackrel{.}= 0\tag2 $$ which has the single solution $$ x_0 = \frac1{\ln2}-1\tag3 $$ so that $$v=f(x_0) \approx 0.08607 $$ Choosing $u$ so that the maximal absolute error is minimal is achieved by $u=v/2$: $$ u = \frac{\ln\left(1/\ln2\right)-1}{2\ln2}+\frac12 \approx 0.04304\tag4 $$