The question is basically this: we have a continuous function $f:R\to R$ satisfying $$f(x)=\int_0^xf$$ for all real $x$, and we are asked to show that $f(x) = 0$ for all real $x$.
The main difficulty for me is the limited toolbox: I only have theorems about derivatives and Taylor/power series (without the use of integrals), some basic integral properties (the ones relating absolute values, sums, and differences; I don't have theorems about change of variables or integration by parts, for example), and the first statement of the Fundamental Theorem of Calculus (no mention of differential equations, nor of the second statement of the FTC):
If $f:[a,b]\to R$ is a continuous function and $F:[a,b]\to R$ is defined by $$F(x) = \int_a^xf$$ for all $x\in[a,b]$, then $F'(x) = f(x)$ for all $x\in (a,b)$.
All these integrals are defined with the Darboux definition, which takes the $\sup$ and $\inf$ of $f$ on each interval of a partition, and then the $\sup$ of the lower sums and the $\inf$ of the upper sums to define the integral. (So we're not using the Riemann definition.)
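Spelled out, for a bounded $f$ and a partition $P=\{a=x_0<x_1<\dots<x_n=b\}$, the lower and upper Darboux sums are $$L(f,P)=\sum_{i=1}^{n} m_i\,(x_i-x_{i-1}),\qquad U(f,P)=\sum_{i=1}^{n} M_i\,(x_i-x_{i-1}),$$ where $m_i=\inf_{[x_{i-1},x_i]}f$ and $M_i=\sup_{[x_{i-1},x_i]}f$, and $f$ is integrable exactly when $\sup_P L(f,P)=\inf_P U(f,P)$, that common value being $\int_a^b f$.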
My idea was this: $f$ is infinitely differentiable, since $f'(x) = f(x)$ by the first part of the FTC, and hence $f^{(n)}(x) = f(x)$ for all $n$. Since $f(0) = 0$ (the integral over a degenerate interval), we get $f^{(n)}(0) = 0$ for all natural $n$. So the Taylor series of $f$ around $x=0$ would simply be $T(x) = 0$ for all real $x$ — but then I'd need to prove that $f$ is analytic on all of $R$, which I don't know how to do... The only theorem my book has about analyticity is this one:
Let $f:I\to R$ be an infinitely differentiable function on an interval $I$ which contains $0$ in its interior. Let $(-r,r)$ be the largest open interval of that form contained in $I$ and such that for each $c$ with $0<c<r$ we have: $$\lim_{n\to\infty}\left[\frac{1}{n!}M_n(c)c^n \right] = 0,$$ where $M_n(c)$ is the maximum of $|f^{(n)}(x)|$ on the interval $[-c,c]$. Then the Taylor series of the function $f(x)$ around $x=0$ converges to $f(x)$ on the interval $(-r,r)$.
But honestly, I don't know exactly how to proceed with my idea of a proof using that theorem... any idea would be very welcome and appreciated!
Hint: First show that it is zero on any interval $[0,b]$.
We have $f(0)=0$ and $f'(x)=f(x)$, so $|f'(x)|\leq |f(x)|$. Now this is an exercise in Rudin's *Principles of Mathematical Analysis*.
Fix $x_0\in[0,b]$ with $x_0<1$, and let $M_0=\sup_{x\in[0,x_0]}|f(x)|$. For any $x\in[0,x_0]$, the Mean Value Theorem gives some $c\in(0,x)$ with \begin{align} |f(x)|=|f(x)-f(0)|\stackrel{MVT}{=}|f'(c)|\,x=|f(c)|\,x\leq x_0M_0. \end{align} Taking the supremum over $x\in[0,x_0]$ on the left, we get $M_0 \leq x_0 M_0$, which forces $M_0=0$, since $x_0<1$. So $f$ vanishes on $[0,x_0]$. Now repeat the argument starting at $x_0$ (using $f(x_0)=0$ in place of $f(0)=0$), advancing in steps of fixed length less than $1$ until you fill up the whole interval. Since $b$ is arbitrary, the claim holds on $[0,\infty)$. A symmetric argument works in the other direction.
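Not part of the proof, but here is a quick numerical illustration of why the map $T(f)(x)=\int_0^x f$ squeezes everything to zero: iterating $T$ on a grid (trapezoid-rule cumulative sums; the function `picard_step`, the grid, and the starting function are all my own choices) shrinks the sup norm toward $0$, which is consistent with the only fixed point being $f=0$.

```python
import numpy as np

# Sketch: iterate T(f)(x) = integral from 0 to x of f on a grid over [0, b].
# A continuous f with f(x) = int_0^x f is a fixed point of T, and after n
# steps the sup norm is bounded by b**n / n! times the starting sup norm,
# which tends to 0 -- so the iterates collapse toward the zero function.

def picard_step(f_vals, xs):
    """One application of f -> int_0^x f via the trapezoid rule on grid xs."""
    h = xs[1] - xs[0]  # uniform spacing assumed
    increments = (f_vals[1:] + f_vals[:-1]) / 2 * h
    return np.concatenate(([0.0], np.cumsum(increments)))

b = 2.0
xs = np.linspace(0.0, b, 2001)
f = np.cos(xs)                      # arbitrary continuous starting function

norms = []
for _ in range(25):
    f = picard_step(f, xs)
    norms.append(np.max(np.abs(f)))

print(f"after 1 step: {norms[0]:.3e}, after 25 steps: {norms[-1]:.3e}")
```

The factorial in the bound $b^n/n!$ is what beats the growth factor $b$, exactly as the step-by-step MVT argument above beats intervals of length $\geq 1$ by restarting on short subintervals.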
EDIT: This assumes you don't know the function $e^x$. Of course, the general solution of $f'(x)=f(x)$ is $f(x)=Ce^x$, and $f(0)=0$ forces $C=0$.
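For completeness, the asker's Taylor-series idea also goes through with the quoted theorem (a sketch, using that theorem's notation): since $f^{(n)}=f$ for every $n$, the quantity $$M_n(c)=\max_{x\in[-c,c]}|f(x)|=:M(c)$$ does not depend on $n$, and for each fixed $c>0$ $$\lim_{n\to\infty}\frac{1}{n!}M_n(c)\,c^n = M(c)\lim_{n\to\infty}\frac{c^n}{n!}=0,$$ because the ratio of consecutive terms of $c^n/n!$ is $c/(n+1)\to 0$. So the hypothesis holds for every $r>0$, i.e. the Taylor series of $f$ at $0$ converges to $f$ on all of $R$; and since $f^{(n)}(0)=f(0)=0$ for all $n$, that series is identically zero, giving $f\equiv 0$.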