A friend once noted that the temperature had doubled from morning to afternoon, from 42 degrees to 84 (Fahrenheit; this was in the U.S.).
I didn't contradict her, but thought to myself that this wasn't really true: an actual doubling of 42 degrees would mean spanning twice the distance from absolute zero to 42, so long before the temperature truly doubled we would have burned to a crisp.
Since $0^\circ$F is a seemingly arbitrary starting point, it really shouldn't figure into a calculation of when a temperature has doubled, correct?
But we do, of course, say that a raise from \$22 per hour to \$44 per hour doubles your salary, because \$0 is a meaningful zero point to measure from.
Are my assumptions valid?
The only zero point that makes sense in this context is absolute zero. That's also what makes, for instance, the ideal gas law look the nicest: at constant volume, doubling the absolute temperature doubles the pressure, and so on. On that scale, twice as warm as the temperature at which water freezes ($273.15$ K) turns out to be about $273^\circ$C, or $524^\circ$F.
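The arithmetic is easy to check: convert to an absolute scale (kelvin), double, and convert back. A minimal sketch in Python; the helper names are my own, not anything standard:

```python
# "Doubling" a temperature only makes sense on an absolute scale.
# These conversion helpers are illustrative, not from any library.

def f_to_kelvin(f):
    """Convert degrees Fahrenheit to kelvin."""
    return (f - 32) * 5 / 9 + 273.15

def kelvin_to_f(k):
    """Convert kelvin back to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# Truly doubling a 42 °F morning:
doubled = 2 * f_to_kelvin(42)              # ≈ 557.4 K
print(round(kelvin_to_f(doubled)))         # ≈ 544 °F: burned to a crisp

# Twice the freezing point of water (273.15 K):
twice_freezing = 2 * 273.15                # 546.3 K
print(round(twice_freezing - 273.15))      # ≈ 273 °C
print(round(kelvin_to_f(twice_freezing)))  # ≈ 524 °F
```

So a genuine doubling of 42 °F lands around 544 °F, which is why the "84 is double 42" reading only works if you treat 0 °F as a true zero.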
You get more or less the same thing with altitudes. Height above sea level is kind of arbitrary, and the actual sea surface rarely matches the nominal datum anyway because of wind and tides. And in the middle of a continent, what is sea level, really? So if you have two mountains and say that one is twice as tall as the other, does that make sense in any objective way?