I would like to prove that if $\gamma \in C^0_c(]a,b[,\mathbb{R})$ and $\epsilon >0$, then we can find $f \in C^{\infty}_c(]a,b[,\mathbb{R})$ and $K \geq 0$ such that $\|f-\gamma\|_{\infty}<\epsilon$ and such that, for every $x \in [a,b]$ at which $\gamma'(x)$ exists, $|\gamma'(x)-f'(x)|<K\epsilon$.
I tried using the Stone–Weierstrass theorem, but I cannot control the condition on the derivative.
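Not part of a proof, but as a numerical sanity check I tried the standard mollification strategy: convolve $\gamma$ with a smooth bump of small width $h$. For a tent function (Lipschitz constant $2$, kinks at $0$, $1/2$, $1$), the smoothed version stays uniformly within about $2h$ of $\gamma$, its derivative stays within the Lipschitz bound, and away from the kinks the derivative matches $\gamma'$ exactly. (All names below are illustrative; note also that mollification alone slightly enlarges the support, so a complete proof would need an extra shrinking step to keep $f$ supported in $]a,b[$.)

```python
import numpy as np

# Grid on a neighbourhood of [0, 1] and a tent function gamma,
# continuous, compactly supported in ]0, 1[, 2-Lipschitz,
# differentiable except at x = 0, 1/2, 1.
x = np.linspace(-0.5, 1.5, 20001)
dx = x[1] - x[0]
gamma = np.maximum(0.0, 1.0 - np.abs(2.0 * x - 1.0))

# Standard bump mollifier supported in ]-h, h[, normalized to integral 1.
h = 0.01
n = int(round(h / dx))
t = dx * np.arange(-n, n + 1)
ker = np.zeros_like(t)
inside = np.abs(t) < h
ker[inside] = np.exp(-1.0 / (1.0 - (t[inside] / h) ** 2))
ker /= ker.sum() * dx

# Mollified function and its (numerical) derivative.
smooth = np.convolve(gamma, ker, mode="same") * dx
dsmooth = np.gradient(smooth, dx)

# Uniform closeness: bounded by (Lipschitz constant) * h = 0.02 here.
print("sup |smooth - gamma| =", np.max(np.abs(smooth - gamma)))
# Derivative stays within the bound ess sup |gamma'| = 2.
print("sup |smooth'|        =", np.max(np.abs(dsmooth)))
# Away from the kinks (e.g. at x = 0.25) the derivatives agree.
print("smooth'(0.25)        =", dsmooth[7500])
```

This matches the intuition that mollification controls both $\|f-\gamma\|_\infty$ and the derivative wherever $\gamma'$ exists and is locally constant, but it says nothing yet about what happens within distance $h$ of a point of non-differentiability.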
More generally, if $\Omega \subseteq \mathbb{R}^N$ is open, is it true that $C^{\infty}_c(\Omega,\mathbb{R})$ is dense in $C^0(\Omega,\mathbb{R})$ with respect to the uniform norm?