We know that for a good linear approximation, $L(x)=f(a)+f'(a)(x-a)$. But what if $f'(a)$ does not exist? How can one prove that if a function has a good linear approximation at $a$, then it must be differentiable there? Thanks a lot!
Definition of a good linear approximation: $f$ has a good linear approximation near $a$ when there exist a line with equation $y=L(x)$ and a function $E(x)$ such that $f(x)=L(x)+E(x)$ for all $x$ at and near $a$, and $\lim\limits_{x \to a}\frac{E(x)}{x-a}=0.$
One definition of $f$ having a good linear approximation at $a$ is $$ f(x)=f(a)+c(x-a)+\epsilon(x-a) $$ where the error term $\epsilon(x-a)$ is a function satisfying $$ \lim_{x\to a}\frac{\epsilon(x-a)}{x-a}=0. $$ Then the differentiability of $f$ follows almost immediately: subtract $f(a)$, divide by $x-a$, and take the limit as $x\to a$, which shows that $f$ is differentiable at $a$ with $f'(a)=c$.
edit: I see now this is exactly how you define having a good linear approximation, so I will add more details to the above explanation: $$ f(x)=f(a)+c(x-a)+\epsilon(x-a) \implies f(x)-f(a)=c(x-a)+\epsilon(x-a)\\ \implies\frac{f(x)-f(a)}{x-a}=c+\frac{\epsilon(x-a)}{x-a} $$ and thus, taking the limit and using that the error quotient tends to $0$ by assumption, $$ \lim_{x\to a}\frac{f(x)-f(a)}{x-a}=c+\lim_{x\to a}\frac{\epsilon(x-a)}{x-a}=c+0=c\\ \implies f'(a)=c. $$
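As a numerical sketch of the argument above (my own illustration, not part of the proof): take the hypothetical example $f(x)=x^2$ at $a=1$ with candidate slope $c=2$, define $\epsilon(x-a)=f(x)-\big(f(a)+c(x-a)\big)$, and check that the error quotient $\epsilon(x-a)/(x-a)$ shrinks to $0$ while the difference quotient approaches $c$.

```python
# Illustration of the good-linear-approximation condition for an assumed
# example: f(x) = x**2 at a = 1, with candidate slope c = 2.

def f(x):
    return x ** 2

a, c = 1.0, 2.0

def error(x):
    # epsilon(x - a) = f(x) - [f(a) + c*(x - a)], the error of the line
    return f(x) - (f(a) + c * (x - a))

for h in [1e-1, 1e-3, 1e-5]:
    x = a + h
    ratio = error(x) / (x - a)            # should tend to 0 as h -> 0
    quotient = (f(x) - f(a)) / (x - a)    # should tend to c = 2
    print(f"h={h:.0e}  epsilon/(x-a)={ratio:.6f}  diff quotient={quotient:.6f}")
```

Here $\epsilon(x-a)=(x-1)^2$, so the error quotient equals $h$ exactly and the difference quotient equals $2+h$, matching the claim that $f'(a)=c$.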