I was reading my textbook and came across a proof of why increasing the index $n$ in the function $f(x)=x^n$ makes the graph stay closer to the $x$-axis between $x=0$ and $x=1$ but climb more steeply beyond $x=1$.
It states in the proof that $x^{n+1}=x\cdot x^n$. Hence:
- $x^{n+1}<x^n$ when $0<x<1$, and $x^{n+1}>x^n$ when $x>1$
But why is this the case? I can't seem to quite understand the intuition behind it.
Here's a quick proof:
Let $0 < x < 1$ and $y > 0$. Choose $\Delta x$ so that $x + \Delta x = 1$, i.e. $\Delta x = 1 - x$; since $0 < x < 1$, it follows that $0 < \Delta x < 1$. Now:
$y$ and $\Delta x$ are both positive, so $y \Delta x$ is positive.
$xy = (1 - \Delta x) y = y - y \Delta x$, but since $y \Delta x$ is positive we're subtracting a positive value, meaning $xy < y$.
For example, if $x = 0.7$, then $\Delta x = 0.3$, and $0.7 y = y - 0.3 y < y$.

Taking $y = x^n$ gives $x^{n+1} = x \cdot x^n < x^n$, which is exactly the inequality for $0 < x < 1$. The case $x > 1$ works the same way: write $x = 1 + \Delta x$ with $\Delta x > 0$, so $xy = y + y \Delta x > y$, and with $y = x^n$ this gives $x^{n+1} > x^n$.
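If it helps to see the two regimes numerically, here is a small sketch (the function name `shrinks` is just illustrative) that checks whether multiplying by $x$ decreases $x^n$:

```python
def shrinks(x, n):
    """True when x**(n+1) < x**n, i.e. multiplying by x shrinks the value."""
    return x ** (n + 1) < x ** n

# For 0 < x < 1, each extra factor of x pulls the value down toward 0:
assert shrinks(0.7, 3)       # 0.7**4 = 0.2401 < 0.7**3 = 0.343

# For x > 1, each extra factor of x pushes the value up:
assert not shrinks(1.5, 3)   # 1.5**4 = 5.0625 > 1.5**3 = 3.375
```

This matches the picture in the question: higher powers hug the $x$-axis on $(0,1)$ and climb faster for $x>1$.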