Consider a differentiable function $f$ on an interval $[a,b]$ with $f(a)=f(a+h)=0$. Then by the mean value theorem, $f'(c)=0$ for some $c$ with $a<c<a+h$.
Now let $h \to 0$. Since $a < c < a+h$, the squeeze (sandwich) theorem gives $c \to a$,
so that $f'(a)=0$.
This is absurd, so I must have made a mistake somewhere. By the way, I got the idea from the slope of the graph of $\sin\frac{1}{x}$ near $0$.
It's not absurd. If $f\colon [a,b]\to\mathbb{R}$ is differentiable at $a$, and if there is a sequence $(h_n)$ in $(0, b-a)$ with $h_n \to 0$ and $f(a+h_n) = f(a)$ for all $n$, then indeed you have $f'(a) = 0$. You don't need the mean value theorem or any squeezing for that; it follows directly from the definition.
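Spelled out, the direct argument is just this (the one-sided limit suffices here, since $h_n > 0$): because the limit defining $f'(a)$ exists, it can be computed along the sequence $(h_n)$, and

$$f'(a) = \lim_{h\to 0}\frac{f(a+h)-f(a)}{h} = \lim_{n\to\infty}\frac{f(a+h_n)-f(a)}{h_n} = \lim_{n\to\infty}\frac{0}{h_n} = 0.$$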
But you need the differentiability of $f$ at $a$, of course.
Your argument with the mean value theorem shows that if $f$ is differentiable everywhere, then the existence of a sequence $(h_n)$ as above implies that there are $c_n\in (a,b)$ with $c_n \to a$ and $f'(c_n) = 0$. From that last piece alone, you can only infer $f'(a) = 0$ if the derivative is continuous at $a$.
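The missing continuity step would read: if $f'$ is continuous at $a$, then

$$f'(a) = \lim_{n\to\infty} f'(c_n) = \lim_{n\to\infty} 0 = 0.$$

Without continuity of $f'$ at $a$, the first equality can fail, even though each $f'(c_n)$ is $0$.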
For the example you got the idea from, $\sin \frac1x$, you have the problem that you cannot even extend that function to $0$ so that it is continuous at $0$, let alone differentiable. If you squeeze down the amplitude of the oscillation near $0$ so that you get a differentiable function, for example $f(x) = x^2\sin\frac1x$ (and $f(0) = 0$, of course), then you have $f'(0) = 0$, but for this example the derivative $f'$ is not continuous at $0$, so the squeezing argument cannot show that $f'(0) = 0$.
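For completeness, here is the computation for that example. For $x \neq 0$, the product and chain rules give

$$f'(x) = 2x\sin\frac1x - \cos\frac1x,$$

while at $0$ the definition gives

$$f'(0) = \lim_{h\to 0} \frac{h^2\sin(1/h) - 0}{h} = \lim_{h\to 0} h\sin\frac1h = 0.$$

Along the sequence $x_n = \frac{1}{2\pi n} \to 0$ we have $\sin\frac{1}{x_n} = 0$ and $\cos\frac{1}{x_n} = 1$, so $f'(x_n) = -1 \not\to 0 = f'(0)$, which shows concretely that $f'$ is discontinuous at $0$.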