Let $y=f(x)$ for $x,y \in \mathbb{R}$;
then
$$\dfrac{dy}{dx} = \lim\limits_{\Delta x \rightarrow 0} \dfrac{\Delta y}{\Delta x},$$
where $\Delta y = f(x+h)-f(x)$ and $\Delta x = h$.
The derivative is meant to capture the sensitivity of the output with respect to the input. So what is the problem if we instead define
$$ f'(x) = \lim\limits_{\Delta x \rightarrow 0} \Delta y\,?$$
What is the significance of the $\Delta x$ in the denominator of the original definition?
My concern is that without the denominator we are no longer measuring the change in output *per unit* change in input, only the change in output for an infinitesimal change in input, and that much seems to be captured by both definitions.
If we define the derivative as you suggest, $$f'(x) = \lim\limits_{\Delta x \rightarrow 0} \Delta y,$$ then since $$\Delta y = f(x+\Delta x)-f(x),$$ we get $$f'(x) = \lim\limits_{\Delta x \rightarrow 0} \Delta y = 0$$ for every continuous function, because continuity means precisely that $f(x+\Delta x) \to f(x)$ as $\Delta x \to 0$.
I am sure you do not approve of such a definition.
On the other hand, dividing $\Delta y$ by $\Delta x$ gives an average rate of change, which in the limit becomes the instantaneous rate of change, namely the derivative.
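To see both points concretely, take $f(x)=x^2$ (an illustrative example of my own choosing). Then
$$\Delta y = (x+\Delta x)^2 - x^2 = 2x\,\Delta x + (\Delta x)^2,$$
so
$$\lim\limits_{\Delta x \rightarrow 0} \Delta y = 0, \qquad \text{while} \qquad \lim\limits_{\Delta x \rightarrow 0} \frac{\Delta y}{\Delta x} = \lim\limits_{\Delta x \rightarrow 0} \left(2x + \Delta x\right) = 2x.$$
The proposed definition returns $0$ no matter what $x$ is, discarding all information about $f$; the standard definition recovers the familiar derivative $2x$.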