Let $f\in C^1(\mathbb{R})$ and $y,y_n\in L^2(0,1)$, and assume that $$\|y_n-y\|_{L^2(0,1)} \longrightarrow 0.$$ Can we deduce that $$\|f(y_n)-f(y)\|_{L^2(0,1)} \longrightarrow 0\,?$$
I think it is not true: it suffices to take the particular case $f(s)=s^2$ and choose adequate $y_n$ and $y$ in $L^2(0,1)$.
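For concreteness, here is one standard choice of this kind (my own spike example; the question above leaves $y_n$ and $y$ unspecified): take $y=0$ and $y_n=n^{1/3}\chi_{(0,1/n)}$. Then $$\|y_n-y\|_{L^2(0,1)}^2=\int_0^{1/n} n^{2/3}\,dx = n^{-1/3}\longrightarrow 0,$$ while $$\|f(y_n)-f(y)\|_{L^2(0,1)}^2=\int_0^{1/n} n^{4/3}\,dx = n^{1/3}\longrightarrow \infty,$$ so $y_n\to y$ in $L^2(0,1)$, yet $f(y_n)=y_n^2$ does not converge to $f(y)=0$ in $L^2(0,1)$.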
And if we add the hypothesis $f \in L^{\infty}(\mathbb{R})$, can we then deduce that $$\|f(y_n)-f(y)\|_{L^2(0,1)} \longrightarrow 0\,?$$
Since $y_n \to y$ in $L^2$, Chebyshev's inequality shows in particular that $y_n \to y$ in measure. Since $f$ is continuous, the continuous mapping theorem then gives $f(y_n) \to f(y)$ in measure (the probabilistic literature says "convergence in probability"; on a finite measure space such as $(0,1)$ the two notions coincide). And since $f$ is bounded, the functions $|f(y_n)-f(y)|$ are uniformly bounded by the constant $2\|f\|_{L^\infty}$, and constants are square-integrable on $(0,1)$, so by the dominated convergence theorem (in its version for convergence in measure), $f(y_n) \to f(y)$ in $L^2$.
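For reference, spelled out (with $\lambda$ denoting Lebesgue measure on $(0,1)$), the Chebyshev step is the estimate $$\lambda\big(\{x\in(0,1) : |y_n(x)-y(x)|>\varepsilon\}\big)\le \frac{1}{\varepsilon^2}\,\|y_n-y\|_{L^2(0,1)}^2 \longrightarrow 0 \qquad \text{for every } \varepsilon>0,$$ and the domination in the last step is $$|f(y_n)-f(y)|^2 \le \big(2\|f\|_{L^\infty}\big)^2 \in L^1(0,1).$$ (If one prefers the classical a.e. version of dominated convergence: every subsequence of $(y_n)$ has a further subsequence converging a.e., the same bound applies along it, and this forces the full sequence $f(y_n)$ to converge to $f(y)$ in $L^2$.)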
We didn't need any assumption on the derivative of $f$; only its continuity and boundedness were used.