Suppose I'm given the following problem
Let $A \subseteq \mathbb{R}^m$ and let $f : A \to \mathbb{R}^n$. Let $a \in A$, let $u \in \mathbb{R}^m$ with $u \neq 0$, and let $c \in \mathbb{R}$ with $c \neq 0$. Show that if $f_u'(a)$ exists, then $f_{cu}'(a)$ exists and $f'_{cu}(a) = c \cdot f_{u}'(a)$.
I can prove the above as follows
Proof: Since $f_u'(a)$ exists, by definition the limit $$\lim_{t \to 0} \frac{f(a+ut)-f(a)}{t}$$ exists and equals $f_u'(a)$. Then we have
\begin{align*} \lim_{t \to 0} \frac{f(a+tcu)-f(a)}{t} &= \lim_{t \to 0} \frac{cf(a+tcu)-cf(a)}{ct} \\ &= \lim_{t \to 0} \frac{c(f(a+tcu)-f(a))}{tc} \\ &= \lim_{s \to 0 } \frac{c(f(a+su)-f(a))}{s} \ \ \ \ \ \ \text{ where $s = tc$} \end{align*}
and now, comparing with the fact that $\lim_{t \to 0} \frac{f(a+ut)-f(a)}{t}$ exists and equals $f'_u(a)$, it follows (using another limit theorem) that $$\lim_{s \to 0 } \frac{c(f(a+su)-f(a))}{s} = c \cdot \left(\lim_{s \to 0 } \frac{f(a+su)-f(a)}{s}\right) = c \cdot f_u'(a),$$ hence $f'_{cu}(a) = c \cdot f_{u}'(a)$ $\square$
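As a quick numerical sanity check of the identity $f'_{cu}(a) = c \cdot f'_u(a)$, here is a small Python sketch using finite differences. The function $f$, the point $a$, the direction $u$, and the scalar $c$ below are my own illustrative choices, not part of the problem:

```python
# Numerical sanity check of f'_{cu}(a) = c * f'_u(a) for a sample
# function f : R^2 -> R^2 (chosen only for illustration).
import numpy as np

def f(p):
    x, y = p
    return np.array([x**2 + y, x * y])

def dir_deriv(f, a, u, t=1e-7):
    # Symmetric difference quotient approximating
    # lim_{t -> 0} (f(a + t*u) - f(a)) / t
    return (f(a + t * u) - f(a - t * u)) / (2 * t)

a = np.array([1.0, 2.0])
u = np.array([1.0, 1.0])
c = 3.0

lhs = dir_deriv(f, a, c * u)   # approximates f'_{cu}(a)
rhs = c * dir_deriv(f, a, u)   # approximates c * f'_u(a)
print(lhs, rhs)                # the two agree up to rounding error
```

Of course this only spot-checks the claim at one point; the proof above is what establishes it in general.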
But in the proof above there's a point where I have to make the substitution $s = tc$, and I'd like to make that step more formal, based on the answer given here: https://math.stackexchange.com/a/167948/266135
This is my attempt to do that. Let $\phi : \Theta \to \mathbb{R}^n$ (where $\Theta = \{ t \in \mathbb{R} \setminus \{0\} \ | \ a+tcu \in A\}$) be defined by $$\phi(t) = \frac{c(f(a+tcu)-f(a))}{tc}$$ and let $g : \Gamma \to \mathbb{R}$ (where $\Gamma = \{tc \in \mathbb{R} \ | \ a+tcu \in A\}$) be defined by $g(tc) = tc$. Then $$\lim_{tc \to 0}g(tc) = 0.$$
Now I want to arrive at $$\lim_{tc \to 0}\phi(g(tc)) = \lim_{t \to 0}\phi(t),$$ but for that I would need $\phi$ to be continuous at $\lim_{tc \to 0}g(tc) = 0$; however, $\phi$ isn't even defined at $0$.
So my question is this: how could I make the substitution argument in my proof above more formal?
I don't understand what all the fuss is about here.
Fix $c\ne 0$ and let $g$ be any function defined in a punctured neighborhood of $0$ for which we have $\lim\limits_{s\to 0} g(s)=L$. Then we have $\lim\limits_{t\to 0} g(ct) = \lim\limits_{s\to 0} g(s)=L$. You can check this easily from the $\delta$-$\epsilon$ definition: if $|g(s)-L|<\epsilon$ whenever $0<|s|<\delta$, then $|g(ct)-L|<\epsilon$ whenever $0<|t|<\delta/|c|$, since then $0<|ct|<\delta$.
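To see this lemma in action numerically, here is a short Python sketch; the particular $g$ and $c$ below are illustrative choices of mine, not part of the answer:

```python
# Illustration of the substitution lemma: if lim_{s->0} g(s) = L,
# then lim_{t->0} g(ct) = L for any fixed c != 0.
import math

def g(s):
    return math.sin(s) / s     # classic example: lim_{s->0} g(s) = 1

c = 5.0
for t in [1e-2, 1e-4, 1e-6]:
    print(t, g(c * t))         # values approach 1 as t -> 0
```

The point of the lemma is that the substitution $s = ct$ in the proof is legitimate without any continuity assumption on $\phi$ at $0$: it is a statement about limits of functions on punctured neighborhoods.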