Edit: the previous title was "Re-writing a function so it becomes differentiable", but I see now that this is not exactly possible. The actual question is to "show that this (problem below) can be written as a smooth constrained optimization problem". I am still unsure of how to do this.
I have an optimization question with a non-differentiable objective function: $$ f(x) = \max(x^2, x)$$ The goal is to minimise $f(x)$ over $x \in \mathbb{R}$.
I have re-written $f(x)$ as $$f(x) = \begin{cases} x & \mbox{for}\; x \in [0,1] \\ x^2 & \mbox{otherwise} \end{cases}$$
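As a quick sanity check (not part of the original question), the piecewise form can be compared numerically against the $\max$ form, since $x \ge x^2$ exactly on $[0,1]$ and $x^2 \ge x$ elsewhere:

```python
import numpy as np

def f_max(x):
    # original definition: f(x) = max(x^2, x)
    return np.maximum(x**2, x)

def f_piecewise(x):
    # rewritten form: x on [0, 1], x^2 otherwise
    return np.where((x >= 0) & (x <= 1), x, x**2)

xs = np.linspace(-2.0, 3.0, 1001)
print(np.allclose(f_max(xs), f_piecewise(xs)))  # the two forms agree
```

The agreement confirms the rewrite is the same function, just expressed piecewise; it does not by itself make the function smooth at the breakpoints $x = 0$ and $x = 1$.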
Is this now a smooth function? I don't feel like I've changed the function, so how could it be smooth?