I have a question about what the $\log_{10}$ error represents, in both an absolute and a relative sense. Let $y$ be the true value and $\hat{y}$ an approximation of $y$. Then:
- Absolute (Forward) Error: $\|\hat{y}-y\|$
- Relative (Forward) Error: $\frac{\|\hat{y}-y\|}{\|y\|}$
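As a quick numerical sketch of these two definitions (the values `y` and `y_hat` are just illustrative), in the scalar case where the norm is the absolute value:

```python
import math

y = math.pi        # true value
y_hat = 3.1416     # approximation of y

abs_err = abs(y_hat - y)           # absolute (forward) error
rel_err = abs(y_hat - y) / abs(y)  # relative (forward) error

print(abs_err, rel_err)
```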
So am I right in saying that:
- $-\log_{10}(\|\hat{y}-y\|)$ measures the number of digits to which $\hat{y}$ agrees with $y$.
- $-\log_{10}\left(\frac{\|\hat{y}-y\|}{\|y\|}\right)$ measures the number of decimal places to which $\hat{y}$ agrees with $y$.
The reason I say the latter counts decimal places is that $$-\log_{10}\left(\frac{\|\hat{y}-y\|}{\|y\|}\right) = \log_{10}(\|y\|) - \log_{10}(\|\hat{y}-y\|),$$ so you get rid of the magnitude of $y$, which sits above the decimal point.
Am I right in making this assumption? I am aware that the quantities above are not integers in general, so they need to be rounded down.
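To make the question concrete, here is a small example (the numbers are made up) computing both $\log_{10}$ quantities and rounding them down as described:

```python
import math

y = 123.456
y_hat = 123.444  # approximation that differs in the second decimal place

abs_err = abs(y_hat - y)       # about 0.012
rel_err = abs_err / abs(y)     # about 9.7e-5

digits_abs = -math.log10(abs_err)  # about 1.92
digits_rel = -math.log10(rel_err)  # about 4.01

# Round down to get integer "digit counts"
print(math.floor(digits_abs), math.floor(digits_rel))  # prints: 1 4
```

Here the relative-error count (4) is larger than the absolute-error count (1) by roughly $\log_{10}(\|y\|) \approx 2$, matching the identity above.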