I know very little calculus, and I'm trying to understand a video from a MOOC I'm taking. At about 1:46 he says that $a$ doesn't approach infinity, but $1/a$ does. I thought it was the exact opposite; am I taking something out of context? Please keep as little calculus in the answer as possible: maybe one or two derivatives, integrals, or limits, but very little.
Thank you.
As $a$ decreases toward $0$, $1/a$ increases to $\infty$.
Imagine $a$ is some tiny microscopic number, and ask how many times $a$ goes into $1$. The answer is a very large number, and it can be made as large as you want by making $a$ small enough.
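To make this concrete, here are a few sample values (these particular numbers are just illustrative, not from the video):
$$a = 0.1 \;\Rightarrow\; \tfrac{1}{a} = 10, \qquad a = 0.001 \;\Rightarrow\; \tfrac{1}{a} = 1000, \qquad a = 0.000001 \;\Rightarrow\; \tfrac{1}{a} = 1{,}000{,}000.$$
Since you allowed one limit, this whole idea is written in limit notation as
$$\lim_{a \to 0^+} \frac{1}{a} = \infty,$$
where $a \to 0^+$ means $a$ shrinks toward $0$ through positive values. So $a$ itself heads to $0$, not infinity, while $1/a$ is the thing that blows up, which is exactly what the lecturer is saying.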