Let $a\in(-1,1)$ and $g\in C^{\infty}(\mathbb{R}, \mathbb{R})$.
Let $S(a, g)$ be the set of all $f$ such that $$f(x) = \int_0^{ax}f(t)\,dt + g(x).$$
The first part was to show that $$S(a, 0) = \{0_{\mathbb{R} \to \mathbb{R}} \},$$ which I did using the Taylor–Lagrange inequality, among other things.
The second part is to deduce from this that $S(a, g)$ has at most one element. I don't know where to start; could you give me some hints, please?
I am still thinking about the first part.
But to prove the second part, suppose $f_1,f_2\in S(a,g)$. Then $$f_i(x)=\int_0^{ax}f_i(t)\,dt+g(x)\mbox{ for }i=1,2.$$ Subtracting the second equation from the first, we get $$f_1(x)-f_2(x)=\int_0^{ax}\big(f_1(t)-f_2(t)\big)\,dt.$$ That is to say, $f_1-f_2\in S(a,0)$. By the first part, $S(a, 0) = \{0_{\mathbb{R} \to \mathbb{R}} \}$, so $f_1=f_2$. This shows that $S(a,g)$ has at most one element.
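As a numerical sanity check (not part of the proof), one can observe that the map $T(f)(x) = \int_0^{ax} f(t)\,dt + g(x)$ is a contraction in the sup norm on $[-1,1]$ with factor $|a| < 1$, so iterating it from two different starting functions should drive them to the same fixed point. The sketch below assumes a sample choice $a = 0.5$ and $g = \sin$, a uniform grid, and a trapezoid-rule antiderivative; all of these are illustrative, not prescribed by the problem.

```python
import numpy as np

a = 0.5
x = np.linspace(-1.0, 1.0, 2001)
dx = x[1] - x[0]
g = np.sin(x)  # an arbitrary smooth g, chosen only for illustration

def T(f):
    """One application of f |-> integral from 0 to a*x of f, plus g, on the grid."""
    # Antiderivative F(x) = integral from -1 to x of f, via the trapezoid rule.
    F = np.concatenate(([0.0], np.cumsum((f[1:] + f[:-1]) / 2) * dx))
    # integral from 0 to a*x of f equals F(a*x) - F(0); evaluate F(a*x) by
    # linear interpolation (a*x stays inside [-1, 1] since |a| < 1).
    return np.interp(a * x, x, F) - np.interp(0.0, x, F) + g

# Iterate from two different starting guesses.
f1 = np.zeros_like(x)
f2 = np.cos(5 * x)
for _ in range(60):
    f1, f2 = T(f1), T(f2)

# The gap shrinks roughly like |a|^n, so both iterates land on the same function.
print(np.max(np.abs(f1 - f2)))
```

The contraction estimate behind this is exactly the one the proof exploits: $|T(f_1)(x) - T(f_2)(x)| = \big|\int_0^{ax}(f_1 - f_2)\big| \le |a|\,\|f_1 - f_2\|_\infty$ for $x \in [-1,1]$.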