This problem comes from a problem set that my calculus professor assigned.
Determine all functions $f: \mathbb{R} \to \mathbb{R}$ such that $$ f(x) = f\Bigl(\frac{x}{2}\Bigr) + f'(x)\, \frac{x}{2} \quad \forall x \in \mathbb{R}. $$ For the record, the answer is all affine functions $f(x) = ax + b$.
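(As a quick sanity check, not part of the problem statement: any affine $f(x) = ax + b$ does satisfy the equation, since $f'(x) = a$:)

```latex
f\Bigl(\frac{x}{2}\Bigr) + f'(x)\,\frac{x}{2}
  = \Bigl(\frac{ax}{2} + b\Bigr) + a\cdot\frac{x}{2}
  = ax + b
  = f(x).
```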
I have an elementary, though ad hoc, solution using supremum and infimum arguments. What I wonder is whether problems of this type are amenable to a specific or "known" method, and if so, how that method applies to this particular problem.
I tried to prove it in a more standard manner but didn't get anything conclusive: assuming $f$ is sufficiently smooth, I differentiated both sides twice and deduced $f''(0)=0$. Inductively, one can prove $f^{(n)}(0)=0$ for all $n\geq 2$. Now, would a Taylor series argument suffice to prove $f(x)=ax+b$?
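For concreteness, here is the computation I have in mind (assuming $f$ is sufficiently smooth for the derivatives below to exist). Differentiating the equation once, the $\frac{1}{2}f'(x)$ terms can be collected; differentiating the result again eliminates $f''(x)$ from one side:

```latex
% First differentiation of f(x) = f(x/2) + f'(x) x/2:
f'(x) = \frac{1}{2} f'\!\Bigl(\frac{x}{2}\Bigr) + f''(x)\,\frac{x}{2} + \frac{1}{2} f'(x)
\;\Longrightarrow\;
f'(x) = f'\!\Bigl(\frac{x}{2}\Bigr) + x\, f''(x).

% Second differentiation:
f''(x) = \frac{1}{2} f''\!\Bigl(\frac{x}{2}\Bigr) + f''(x) + x\, f'''(x)
\;\Longrightarrow\;
0 = \frac{1}{2} f''\!\Bigl(\frac{x}{2}\Bigr) + x\, f'''(x).
```

Setting $x = 0$ in the last identity gives $f''(0) = 0$, and differentiating repeatedly in the same way yields $f^{(n)}(0) = 0$ for $n \geq 2$.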
Thanks in advance