Also, given that $f$ is a function such that $f(x)>0$ and $f'(x)$ is continuous at every real number $x$.
Now we can write $f(t)\ge \sqrt{f(0)}+\int_{0}^{t} \sqrt{f(s)}\, ds = \sqrt{f(0)}+\sqrt{f(1)}+\int_{1}^{t} \sqrt{f(s)}\, ds$
Since $f(0)=0$, this gives $f(t)\ge \sqrt{f(1)}+\int_{1}^{t} \sqrt{f(s)}\, ds$
I am quite certain that this is wrong. Give me some hints to solve this please.
Solution:
In line with what @GAVD and @grand_chat suggested:
"Hint: The Mean Value Theorem states $$ g(x) - g(1) = g'(t)(x-1) $$ for some $t$ between $1$ and $x$. The obvious choice for $g$ is $g(x):=\sqrt{f(x)}$."
We see that $g'(t)=\dfrac{f'(t)}{2\sqrt{f(t)}}\ge \dfrac{1}{2}$, since $f'(t)\ge\sqrt{f(t)}$ by hypothesis.
$\therefore \sqrt{f(x)}-\sqrt{f(1)}\ge \dfrac{1}{2}(x-1)$ for $x\ge 1$.
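As a quick numerical sanity check of the inequality $\sqrt{f(x)}-\sqrt{f(1)}\ge \dfrac{1}{2}(x-1)$, here is a short sketch. The example function $f(x)=(x/2+1)^2$ is my own choice (not from the original post): it is positive, and $f'(x)=x/2+1=\sqrt{f(x)}$, so it satisfies the assumed hypothesis $f'(x)\ge\sqrt{f(x)}$ with equality, making it the borderline case.

```python
import math

# Hypothetical example (chosen for illustration): f(x) = (x/2 + 1)^2.
# It satisfies f(x) > 0 and f'(x) = x/2 + 1 = sqrt(f(x)), i.e. the
# assumed hypothesis f'(x) >= sqrt(f(x)) holds with equality.
def f(x):
    return (x / 2 + 1) ** 2

# Check the derived bound sqrt(f(x)) - sqrt(f(1)) >= (x - 1)/2
# at a few sample points x >= 1.
for x in [1.0, 2.0, 5.0, 10.0]:
    lhs = math.sqrt(f(x)) - math.sqrt(f(1))
    rhs = (x - 1) / 2
    assert lhs >= rhs - 1e-12, (x, lhs, rhs)
print("bound holds at all sample points")
```

For this borderline $f$ the bound holds with equality, which matches the Mean Value Theorem argument: $g(x)=\sqrt{f(x)}=x/2+1$ has $g'(t)=1/2$ exactly.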
To be honest, I also thought of it along the lines of the Mean Value Theorem, but I was confused by the step $g'(t)\ge \dfrac{1}{2}$.