In a microcontroller (MSP430F5659), the RTCB timer provides clock-based interrupts at register-controlled intervals. These may be 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 (and higher) interrupts per second. The same timer also provides a 1-second interrupt, which is handy for evaluating the problem below.
Using one of the clock-based interrupts I'd like to update a variable called milliseconds. For reasons of efficiency and precision, I'm using the 1024 times per second interrupt.
Each interrupt represents 0.9765625 milliseconds. I don't want to store milliseconds as a float value, but rather as an integer.
I suppose incrementing a floating-point variable by 0.9765625 and then rounding would do the trick, but I'd rather not add floating-point processing overhead.
Therefore I need a mathematical function that keeps my millisecond counter within ±1 ms of the true elapsed time. Each time the interrupt fires, it should decide whether to increment the millisecond counter or wait until the next interrupt.
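One common integer-only way to make that per-tick decision is a Bresenham-style error accumulator. The sketch below uses hypothetical names (`milliseconds`, `error`, `tick_1024hz`) rather than anything MSP430-specific; the real version would live inside the RTC interrupt handler:

```c
#include <stdint.h>

static volatile uint32_t milliseconds = 0;  /* hypothetical millisecond counter */
static uint16_t error = 0;                  /* accumulated fraction, in units of (1/1024) ms */

/* Called once per 1024 Hz interrupt. Each tick is worth 1000/1024 ms,
   so add 1000 to the error term and emit one millisecond each time it
   reaches 1024. Since 1000 < 1024, at most one increment per tick is
   ever needed, and the counter never drifts more than 1 ms from the
   true elapsed time. */
static void tick_1024hz(void)
{
    error += 1000;
    if (error >= 1024) {
        error -= 1024;
        milliseconds++;
    }
}
```

After exactly 1024 ticks (one second), the accumulator returns to zero and `milliseconds` has advanced by exactly 1000, so no error carries over between seconds.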
Let your master counter count interrupts rather than milliseconds. Then whenever someone needs a millisecond count, convert the interrupt count to milliseconds on the fly: $$ \mathit{millis} = \frac{1000}{1024}\mathit{intrs} = \left(1 - \frac{24}{1024}\right)\mathit{intrs} = \mathit{intrs} - \frac{3\cdot \mathit{intrs}}{2^7} $$ where the division is just a right shift by 7 bits.
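A minimal C sketch of that conversion (the function name is illustrative, not from any MSP430 header):

```c
#include <stdint.h>

/* Convert a count of 1024 Hz interrupts to milliseconds.
   1000/1024 = 1 - 24/1024 = 1 - 3/2^7, so the scaling reduces to
   a multiply by 3 and a right shift by 7. Because the shift floors
   the subtracted term, the result is never more than 1 ms above
   the exact value. Note 3*intrs overflows 32 bits after roughly
   16 days of ticks; widen the types if that matters. */
static uint32_t millis_from_intrs(uint32_t intrs)
{
    return intrs - ((3u * intrs) >> 7);
}
```

Keeping the master counter in interrupt units also means the ISR stays trivial (a single increment); the small amount of arithmetic is paid only when someone actually asks for milliseconds.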