Suppose $f:\mathbb R \rightarrow \mathbb R$ is differentiable with $f(0)=f(1)=0$ and $\{x: f'(x)=0\} \subset \{x: f(x)=0\}$. Show that $f(x)=0$ for all $x \in [0,1]$.
Since $f$ is differentiable on $[0,1]$ and $f(0)=f(1)=0$, by Rolle's theorem there exists a $c \in(0,1)$ such that $f'(c)=0$. Let $A= \{x: f'(x)=0\}$ and $B=\{x: f(x)=0\}$. Now, $c\in A \subset B$, which implies $f(c)=0$.
How do I conclude that $f(x)=0$ for all $x \in [0,1]$?
You are almost there! Note that since $[0,1]$ is compact and $f$ is continuous, $f$ attains its maximum and minimum values on the interval (extreme value theorem).
Suppose, for contradiction, that $f$ is not identically zero on $[0,1]$. Without loss of generality, its maximum value $r$ on $[0,1]$ satisfies $r>0$ (otherwise replace $f$ by $-f$, which satisfies the same hypotheses). So there exists some $z\in[0,1]$ such that $f(z)=r$, and $z\in(0,1)$ since $f(0)=f(1)=0$.
Now suppose that $f'(z)\neq0$. If $f'(z)>0$, then for all sufficiently small $\epsilon>0$ we have $f(z+\epsilon)>f(z)$; if $f'(z)<0$, then $f(z-\epsilon)>f(z)$. Either way, $f(z)$ is not the maximum, a contradiction. So $f'(z)=0$, and then the hypothesis $\{x: f'(x)=0\} \subset \{x: f(x)=0\}$ forces $f(z)=0$, contradicting $f(z)=r>0$.
So the maximum and minimum values of $f$ on the interval are both $0$, proving the desired result.
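The key step, the contradiction at the interior extremum, can be written as a single chain (a sketch restating the argument above, where $z$ denotes the interior maximum point):

```latex
\[
f(z) = \max_{x \in [0,1]} f(x) = r > 0,\quad z \in (0,1)
\;\Longrightarrow\; f'(z) = 0
\quad \text{(interior extremum of a differentiable function)}
\]
\[
f'(z) = 0
\;\Longrightarrow\; f(z) = 0
\quad \text{(since } \{x : f'(x)=0\} \subset \{x : f(x)=0\}\text{)},
\]
```

which contradicts $f(z)=r>0$. The same chain applied to $-f$ rules out a negative minimum.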