Let $p,q\in \mathbb{R}$ be such that $p+q=1$, and let $\Delta t, \Delta x\in \mathbb{R}$.
I would like to understand why it is exactly $\dfrac{p-q}{\Delta x}$ that must be bounded, and not, say, $\dfrac{p-q}{\Delta t}$. In other words, why does $$\frac{(p-q)\Delta x}{\Delta t}<+\infty \iff \frac{p-q}{\Delta x} \mbox{ is bounded}\,?$$ Is there a concept or theorem in analysis that says so? Please mention it in your explanation. And why does the boundedness of $\dfrac{p-q}{\Delta x}$ imply $$\lim_{\Delta t,\Delta x \to 0}(p-q)=\lim_{\Delta x \to 0}\Delta x=0\,?$$ Thanks in advance.
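To make the question concrete, here is a small numerical sketch. The scalings $\Delta t = \Delta x^2/D$ and $p-q=(c/D)\,\Delta x$ are my own assumptions, not given above; I chose them only so that $(p-q)/\Delta x \to c/D$, consistent with the limit in the Addition below. Under these assumptions $(p-q)\Delta x/\Delta t$ stays finite while $p-q$ itself shrinks with $\Delta x$:

```python
# Numerical sketch. The scalings below are MY assumptions, not given in
# the question: dt = dx**2 / D and p - q = (c/D) * dx, chosen so that
# (p - q)/dx equals c/D, matching the limit in the Addition.
c, D = 1.0, 0.5  # arbitrary illustrative constants

for dx in [0.1, 0.01, 0.001]:
    dt = dx**2 / D        # assumed diffusive scaling
    pq = (c / D) * dx     # assumed drift bias, p - q
    # (p-q)*dx/dt stays equal to c (finite), (p-q)/dx stays at c/D
    # (bounded), while p-q itself goes to 0 with dx.
    print(dx, pq, pq * dx / dt, pq / dx)
```

So under this (assumed) scaling, boundedness of $(p-q)/\Delta x$ and finiteness of $(p-q)\Delta x/\Delta t$ go hand in hand.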
Addition:
I found that $$\lim_{\Delta t, \Delta x \to 0}\frac{p-q}{\Delta x}=\frac{c}{D} \implies \lim_{\Delta t, \Delta x \to 0}(p-q)=\lim_{\Delta t, \Delta x \to 0} \Delta x=0,$$ but I still cannot see why.
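Written out explicitly, the step seems to rely on the product law for limits (my guess, assuming the limit $c/D$ exists and is finite):

```latex
% Product law for limits: if (p-q)/\Delta x \to c/D (finite) and
% \Delta x \to 0, the product converges to the product of the limits.
\lim_{\Delta t,\Delta x \to 0}(p-q)
  = \lim_{\Delta t,\Delta x \to 0}\left(\frac{p-q}{\Delta x}\cdot \Delta x\right)
  = \frac{c}{D}\cdot 0
  = 0.
```

But I am not sure this is the theorem actually being invoked here, nor whether it applies when $\Delta t$ and $\Delta x$ tend to $0$ jointly.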

