I am writing some software that must do generalized time multiplication of various denominations. I can't wrap my head around how to generalize it.
[edit] The use case is this: you have a clock measuring time, and you want to provide a mapping (a ratio) that accelerates or decelerates the clock. For example, every second passed on the real clock is equivalent to 1 day passed on the accelerated clock. Thus $clocktime = x\,seconds * n\,days$, with the $n$ days probably represented in seconds.
For example, let's say I have:
$1min * 2min = 2min\\ 60s * 120s = 7200s^2\\ 7200s^2 / 60s = 120s = 2min$
Now, if I multiply time in days, for example:
$1days * 2days = 2days$
And in seconds (with $2days = 172'800s$):
$1 day = 86'400s\\ 86'400s * 172'800s = 14'929'920'000s^2\\ 14'929'920'000s^2 / 60s = 248'832'000s$
But, that is incorrect.
$248'832'000s / 60 = 4'147'200min\\ 4'147'200min / 60 = 69'120h\\ 69'120h / 24 = 2'880days$
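What I do notice is that dividing by $86'400s$ (one day in seconds) instead of by $60s$ gives the result I expected:

$14'929'920'000s^2 / 86'400s = 172'800s\\ 172'800s / 86'400 = 2days$

So the divisor apparently has to carry the denomination I want back, but I can't see how to generalize that.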
So my question is: how do you generalize time multiplication from/to various denominations? Ultimately, what am I missing here? Thanks.
This is a programming question, but ultimately you must encode the unit as well as the value. That way multiplying two times gives $m\,s \cdot n\,s = (m \cdot n)\,s^2$ (note the squared unit — this is what your seconds arithmetic is telling you), whereas multiplying by a dimensionless ratio gives $n \cdot m\,s = (n \cdot m)\,s$, which is what your accelerated clock actually needs. Conversions between units are then handled by a conversion function (with whatever templating mechanism your language offers). For good examples see the time libraries in C++ and Rust (`std::chrono` in C++, `std::time::Duration` in Rust).