I came across the following problem in a prelim question paper. The question as stated seems meaningless to me; I am attaching a picture so as to rule out any transcription error on my end.

My issue with the above question is that $f$ is given to be defined on $(0,1)$, so $f(x)$ makes sense only if $x\in (0,1)$. The problem doesn't say so, but suppose we therefore assume that the range is contained in $(0,1)$. Then, since $x^2<1$ for $x\in(0,1)$, the condition would say that each value $x$ is taken at most $x^2<1$ times, i.e. $0$ times, which is again nonsense.
I am not sure whether I am missing something or the question is really wrong. If it is wrong, what would be “the nearest correct” version of it? For instance, should the function be defined from $\mathbb{R}$ to $(0,\infty)$?
As others have said, the question doesn't make sense as stated, but what we can prove is the following: if $f: (0,1) \rightarrow \mathbb{R}$ is continuous and $|f^{-1}(x)|\leq x^2$ for every $x$, then $f$ is differentiable a.e. For that it suffices to show that $f$ is differentiable a.e. on every interval of the form $[1/n, 1-1/n]$.
Note that $|f|$ is bounded on such an interval, say by $N\in\mathbb{N}$, so by the assumption $f$ attains each value at most $N^2$ times on $[1/n, 1-1/n]$.

This implies that $f$ is of bounded variation on said interval. Let $1/n=a_0<a_1<\dots<a_k=1-1/n$ be any partition, and for each $i$ let $J_i$ denote the closed interval with endpoints $f(a_{i-1})$ and $f(a_i)$; by the intermediate value theorem, $f$ attains every value of $J_i$ somewhere on $[a_{i-1},a_i]$. I claim the total length of the $J_i$ cannot exceed $2N^3$. Indeed, each $J_i$ is contained in the interval $[-N,N]$ (of length $2N$), so if the total length exceeded $2N^3=N^2\cdot 2N$, then by the pigeonhole principle the set of points of $[-N,N]$ lying in at least $N^2+1$ of the $J_i$ would have positive measure. In particular, we could pick such a point $x$ that is not one of the finitely many values $f(a_j)$. But then $f$ attains the value $x$ in each of the corresponding open intervals $(a_{i-1},a_i)$, which are pairwise disjoint, so $f$ attains $x$ at least $N^2+1$ times: a contradiction.

Hence
$$\sum_{i=1}^k|f(a_i)-f(a_{i-1})| = \sum_{i=1}^k \operatorname{length}(J_i) \leq 2N^3,$$
and since $1/n=a_0<a_1<\dots<a_k=1-1/n$ was an arbitrary partition of $[1/n,1-1/n]$, $f$ is of bounded variation thereon. Hence it is a.e. differentiable on $[1/n,1-1/n]$.
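As a quick numerical sanity check (not part of the proof), here is a short Python sketch of the variation bound: if $|f|\leq N$ and $f$ attains each value at most $m$ times, the variation over any partition should not exceed $2Nm$. The choices below are mine: $f(x)=\sin(6x)$ on $[0,1]$, which is bounded by $N=1$ and has three monotone pieces, so it attains each value at most $m=3$ times.

```python
import numpy as np

# Sanity check of the bound: for any partition a_0 < ... < a_k,
# sum |f(a_i) - f(a_{i-1})| <= 2 * N * m, where |f| <= N and f attains
# each value at most m times.
# Example function: f(x) = sin(6x) on [0, 1], so N = 1 and m = 3
# (three monotone pieces of sin on [0, 6]).
rng = np.random.default_rng(0)

def f(x):
    return np.sin(6 * x)

N, m = 1, 3
bound = 2 * N * m  # = 6

worst = 0.0
for _ in range(200):
    # random partition of [0, 1] with up to 999 interior points
    pts = np.sort(rng.uniform(0, 1, size=rng.integers(1, 1000)))
    partition = np.concatenate(([0.0], pts, [1.0]))
    variation = np.abs(np.diff(f(partition))).sum()
    worst = max(worst, variation)

print(worst, bound)
```

For fine partitions the sums approach the actual total variation of $\sin(6x)$ on $[0,1]$, which is $\int_0^6|\cos u|\,du\approx 3.72$, comfortably below the (deliberately crude) bound $2N^3$-style estimate of $6$.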