I am currently working on a drum sequencer program in C#. I want the user to be able to control the tempo via a trackbar. I am using a timer to drive a progress bar and to trigger SoundPlayer events.
The timer's interval is 100 ms, and the progress bar resets when it reaches its maximum (default maximum of 80 ticks, for a total duration of 8 seconds). By default, the trackbar is set to 120 BPM.
The trackbar ranges from 90 to 150. I have 4 bars at 120 BPM by default, which has a total duration of 8 seconds.
I have been scratching my head for some time about the math for this. I have concluded that at:
Tempo = 90: timer interval = 125, thus a total time of 10,000 milliseconds.
Tempo = 120: timer interval = 100, thus a total time of 8,000 milliseconds.
Tempo = 150: timer interval = 75, thus a total time of 6,000 milliseconds.
So to me it appears to be a linear relationship, for which I should multiply the trackbar value by something to change the timer interval.
But by WHAT???
I calculated the ratios between interval and BPM, but it did not get me any further in understanding the relationship. They are as follows:
125/90 ≈ 1.39, 100/120 ≈ 0.83, 75/150 = 0.5
ANSWERS to comments: The trackbar is indeed setting the BPM. There are a total of 16 beats (4 bars with 4 beats).
I started out knowing that at 120 BPM these 16 beats will play for a total duration of 8 seconds, because to convert BPM to milliseconds per beat, the equation is 60,000 / BPM = ms.
Why 60,000? There are 60 seconds in a minute and 1,000 ms in a second, so 60 * 1,000 = 60,000. 60,000 divided by 120 BPM = 500 ms (0.5 seconds).
500 ms * 4 beats per measure = 2,000 ms, or 2 seconds per measure.
2 seconds * 4 measures = 8 seconds.
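The arithmetic above can be sketched in C# (the variable names are mine, not from the actual program):

```csharp
// Convert BPM to milliseconds per beat, then scale up to the full pattern.
double bpm = 120.0;
double msPerBeat = 60000.0 / bpm;     // 60,000 ms per minute / 120 BPM = 500 ms
double msPerMeasure = msPerBeat * 4;  // 4 beats per measure = 2,000 ms
double totalMs = msPerMeasure * 4;    // 4 measures = 8,000 ms
```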
So based on that, I concluded (maybe incorrectly) that if I reduce the tempo by 25% (to 90), then the interval should increase by 25% (to 125), thus increasing the total duration by 25%, to 10,000 ms.
Hope this clarifies things.
Thanks for looking, I truly appreciate any help/direction on this problem!!!
You are correct that at $120$ BPM you will have each beat being $0.5$ seconds and $16$ beats will take $8$ seconds. If you reduce the beat rate to $90$ BPM, each beat will be $\frac {60}{90}=\frac 23$ seconds and $16$ beats will take $16 \cdot \frac 23\approx 10.667$ seconds. A reduction in BPM of $25\%$ does not increase the time by $25\%$, it increases it by $\frac 13 \approx 33.333\%$. Is that what is confusing you?
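In other words, the total time is inversely proportional to the BPM, not linear in it: total ms $= \frac{16 \cdot 60000}{\text{BPM}}$. Here is a sketch of what that gives for the timer (method names are mine; it assumes the progress bar keeps its 80 ticks, i.e. 5 timer ticks per beat):

```csharp
using System;

// Total playing time for 16 beats, and the timer interval that divides
// that time into 80 progress-bar ticks (so interval = 12000 / bpm).
static double TotalMs(double bpm) => 16 * 60000.0 / bpm;
static double IntervalMs(double bpm) => TotalMs(bpm) / 80;  // = 12000 / bpm

Console.WriteLine(TotalMs(120));    // 8000   -> the 8-second default
Console.WriteLine(IntervalMs(120)); // 100    -> matches the 100 ms timer
Console.WriteLine(TotalMs(90));     // 10666.67 -> not 10,000
Console.WriteLine(IntervalMs(90));  // 133.33   -> not 125
Console.WriteLine(IntervalMs(150)); // 80       -> not 75
```

So rather than multiplying the trackbar value by a constant, the interval comes from dividing a constant (12,000 here) by the trackbar's BPM value.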