I am reading an electrical engineering textbook that states that the relationship between current $i$, charge $q$, and time $t$ is
$$i = \dfrac{dq}{dt} \tag{1}$$
Based on this, the authors then state that
$$\Delta q = i \Delta t \tag{2}$$
This set off alarm bells in my head. Now, (2) may actually be true, but presenting it as a direct implication of (1) just seems like incorrect mathematics. If I had to fill in the blanks of the authors' thinking, it seems to me that they were likely rationalising it via the limit definition of the derivative
$$\dfrac{df}{dx} = \lim_{\Delta x \to 0} \dfrac{f(x + \Delta x) - f(x)}{\Delta x} = \lim_{\Delta x \to 0} \dfrac{\Delta f}{\Delta x}$$
But, nonetheless, I don't see how, mathematically, $i = \dfrac{dq}{dt}$ implies $\Delta q = i \Delta t$.
Am I mistaken here, or is this actually incorrect mathematics used as some kind of hand-wavy justification for an engineering equation?
Integrate both sides with respect to $t$ from $t = 0$ to $t = \Delta t$. You should get something like $$ i\, \Delta t = \int_{q(0)}^{q(\Delta t)} dq = q(\Delta t) - q(0),$$ from which you can define $\Delta q = q(\Delta t) - q(0)$.
Oh, I just noticed that $i$ needs to be time-independent, otherwise this won't work. Anyway, this should confirm your doubt.
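To make the constant-current caveat concrete, here is a small numerical sketch (the `charge_transferred` helper and the sample currents are my own illustration, not from any textbook): it integrates $i(t)$ over $[0, \Delta t]$ and checks that $\Delta q = i\,\Delta t$ holds exactly for a constant current, but only approximately, with an error of order $\Delta t^2$, for a time-varying one.

```python
def charge_transferred(i_func, dt, steps=100_000):
    """Approximate Delta q = integral of i(t) dt over [0, dt] (midpoint rule)."""
    h = dt / steps
    return sum(i_func((k + 0.5) * h) for k in range(steps)) * h

dt = 0.1

# Constant current: Delta q = i * dt holds exactly.
i_const = 2.0
dq_const = charge_transferred(lambda t: i_const, dt)
assert abs(dq_const - i_const * dt) < 1e-9

# Time-varying current i(t) = 2 + 5t (arbitrary example):
# the exact integral is 2*dt + 2.5*dt**2, which differs from i(0)*dt
# by the second-order term 2.5*dt**2.
dq_var = charge_transferred(lambda t: 2.0 + 5.0 * t, dt)
assert abs(dq_var - (2.0 * dt + 2.5 * dt**2)) < 1e-6
assert abs(dq_var - 2.0 * dt) > 0.01  # i(0)*dt is NOT the true Delta q here
```

So $\Delta q = i\,\Delta t$ is exact only when $i$ is constant; in general it is the first-order (small-$\Delta t$) approximation that engineering texts often use implicitly.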