I am not a mathematician, but I have a theoretical question.
To refer to a probability, we can use either a decimal (e.g., 0.1) or a percentage (e.g., 10%), because % simply means "multiply by 1/100", right? (Probability here does not necessarily mean frequency; it also includes a scale of subjective confidence, as in Bayesian statistics.)
If so, we should be able to say both:
- "The probability increased from 0.1 to 0.15, or by 0.05."
- "The probability increased from 10% to 15%, or by 5%."
and these two sentences should be interchangeable.
However, according to the standard mathematical explanation, the second sentence should be written as:
- "The probability increased from 10% to 15%, or by 5 percentage points."
because % is a relative measure, so an absolute difference should be expressed in percentage points. But in the decimal format, none of the three numbers (0.1, 0.15, and 0.05) needs to be distinguished from the others by a different unit (right?). It therefore seems inconsistent to apply two different units (% and percentage points) once we multiply these decimals by 100.
The standard explanation also says that if we use the % unit to describe the difference itself, we should write:
- "The probability increased from 10% to 15%, or by 50%."
because % is a relative measure. But then, if % simply means "multiply by 1/100" and we remove the symbol from the sentence above, it does not convert back to the original sentence, "The probability increased from 0.1 to 0.15, or by 0.05," but instead becomes "The probability increased from 0.1 to 0.15, or by 0.50".
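To make the two kinds of difference I am comparing concrete, here is a short numerical sketch (my own illustration; the variable names are hypothetical):

```python
# Absolute vs. relative change for a probability moving from 0.10 to 0.15.
p_old, p_new = 0.10, 0.15

# Absolute change: a difference on the same scale as the probabilities
# themselves (about 0.05, i.e., 5 percentage points).
absolute_change = p_new - p_old

# Relative change: the difference divided by the starting value
# (about 0.5, i.e., a 50% increase).
relative_change = (p_new - p_old) / p_old

print(f"absolute: {absolute_change:.2f} ({absolute_change * 100:.0f} percentage points)")
print(f"relative: {relative_change:.2f} ({relative_change * 100:.0f}%)")
```

So "5 percentage points" and "50%" describe the same move on two different scales, which is exactly where my confusion about removing the % symbol arises.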
Is my reasoning flawed in some way?