How can we interpret the difference between two log points? Is it correct to interpret this difference in percentage points?
Thanks.
Marko
Just to elaborate some on what Spencer wrote:
$\text{log}(x)-\text{log}(y)=\text{log}(x/y)=\text{log}(\frac{x-y}{y}+1)=\text{log}(\frac{\%\Delta}{100}+1) $
where $\%\Delta$ is the percentage change. A first order Taylor approximation around $\%\Delta=0$ gives
$\text{log}(x)-\text{log}(y)=\text{log}(\frac{\%\Delta}{100}+1) \approx \frac{\%\Delta}{100}$
The exact percentage change is of course given by
$100(\text{e}^{\text{log}(x)-\text{log}(y)}-1)$
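A quick numerical sketch of the relationship above (the values 100 and 105 are just an illustrative example, a 5% increase):

```python
import math

# Hypothetical example: y grows to x, a 5% increase.
y = 100.0
x = 105.0

log_diff = math.log(x) - math.log(y)        # difference in log points
approx_pct = 100 * log_diff                  # first-order Taylor approximation
exact_pct = 100 * (math.exp(log_diff) - 1)   # exact percentage change

print(round(100 * log_diff, 2))  # approximation: 4.88
print(round(exact_pct, 2))       # exact: 5.0
```

The 4.88 log points approximate, but do not equal, the exact 5% change; the gap widens as the change gets larger.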
$$\log(x)-\log(y) = \log(x/y)$$
If $x$ differs from $y$ by a factor of $1+\epsilon$ then we have,
$$\log(x)-\log(y) = \log(x/y) = \log(1+\epsilon) \approx \epsilon.$$
The last approximation is only valid for small values of $\epsilon$. Since $\epsilon$ can be interpreted as a proportional (percent) difference, I would say that comparing the difference of logarithms to a percentage is only sensible if the difference is small.
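To see how quickly the approximation degrades, here is a small sketch sweeping over a few values of $\epsilon$ (the grid of values is arbitrary, chosen just for illustration):

```python
import math

# For x = (1 + eps) * y, the log difference is log(1 + eps),
# while the proportional difference is eps itself.
for eps in [0.01, 0.05, 0.10, 0.50, 1.00]:
    log_diff = math.log(1 + eps)
    error = abs(log_diff - eps)
    print(f"eps={eps:.2f}  log-diff={log_diff:.4f}  abs error={error:.4f}")
```

At $\epsilon=0.01$ the error is negligible, but at $\epsilon=1$ (a doubling) the log difference is about 0.69, far from the 100% change it is sometimes casually read as.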
I think a more reasonable interpretation is just to go back to the definition of the logarithm. The difference $\log(x)-\log(y)$ is the power to which $e$ must be raised to give the factor by which $y$ must be multiplied to equal $x$; that is, $y\, e^{\log(x)-\log(y)} = x$.