If $f(f(x))=0$ has at least one real root, and $a$ is a real root of $f(x)$, is it necessary for the equation $f(x)=a$ to have at least one real solution/root? If so, why?
My contention is: if $f(a)=0$, why does that force $f(x)=a$ to have a real root?
How do I approach this? Any hint would be helpful.
I presume $f$ is a polynomial in $x$ with real coefficients.
$f$ has a real root $a$; this means $f(a)=0$, or equivalently, that the linear factor $(x-a)$ divides $f$. Thus we can write $f(x)=(x-a)g_0(x)$ for some polynomial $g_0(x)$.
Now, notice that $(f\circ f)(x)=[f(x)-a]\cdot (g_0\circ f)(x)$, obtained by substituting $f(x)$ for $x$ in the factorization above.
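As a quick sanity check, this factorization identity can be verified symbolically, e.g. with SymPy; the particular $f$ below (with root $a=1$) is just an illustrative choice, not tied to the argument:

```python
# Sanity check of the identity (f∘f)(x) = [f(x) - a] * (g0∘f)(x)
# for a sample polynomial with real root a = 1 (illustrative choice).
import sympy as sp

x = sp.symbols('x')
a = 1                                    # a real root of f
f = sp.expand((x - a) * (x**2 + x + 3))  # f(x) = (x - a) * g0(x)
g0 = sp.simplify(f / (x - a))            # recover g0(x) = f(x)/(x - a)

lhs = f.subs(x, f)                       # (f∘f)(x)
rhs = (f - a) * g0.subs(x, f)            # [f(x) - a] * (g0∘f)(x)
assert sp.expand(lhs - rhs) == 0         # the identity holds
```

The assertion passes because substituting $f(x)$ into $f(x)=(x-a)g_0(x)$ is exactly what composition does.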
From the above, we can see that, for $f\circ f$ to have a real root, there must be a real solution to $f(x)-a=0$ or to $(g_0\circ f)(x)=0$.
($\ast$) Assume, for the sake of contradiction, that for every real root $a$ of $f$, the equation $f(x)-a=0$ has no real solution. Then it must be the case that $(g_0\circ f)(x)$ has at least one real root; call this root $r_0$, so $(g_0\circ f)(r_0)=0$, i.e., $g_0(f(r_0))=0$, i.e., $f(r_0)$ is a root of $g_0$. But since $g_0$ is a factor of $f$, a root of $g_0$ must be a root of $f$; thus $a_0:=f(r_0)$ is a root of $f$. If $a_0=f(r_0)=a$, this contradicts the assumption that $f(x)-a=0$ has no real solutions (because $r_0$ is then a real solution to it).
Thus $a_0=f(r_0)\ne a$, and $a,a_0$ are both real roots of $f$, so we can now write $f(x)=(x-a)(x-a_0)g_1(x)$ and $(f\circ f)(x)=[f(x)-a]\cdot [f(x)-a_0]\cdot (g_1\circ f)(x)$. Again, for $f\circ f$ to have a real root, if $f(x)-a=0$ or $f(x)-a_0=0$ had a real solution, we would get a contradiction, since $a,a_0$ are both real roots of $f$ and are covered by the assumption at $(\ast)$.
Since $f$ is a polynomial of necessarily finite degree, we can keep repeating the above argument until we reach a point where $f(x)=(x-a)(x-a_0)\cdots (x-a_k)g_k(x)$ with $g_k$ having no real roots. Applying the argument once more, for $f\circ f$ to have a real root, one of the equations $f(x)-a=0, f(x)-a_0=0,\ldots, f(x)-a_k=0$ must have a real solution, since $g_k\circ f$ cannot have a real root ($g_k$ has none, and $f(x)$ is real for real $x$), and hence we reach a contradiction.
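The structure this argument uncovers can be checked concretely: once all real roots $a_i$ of a polynomial are split off, every real root $r$ of $f\circ f$ must satisfy $f(r)=a_i$ for one of them. The sample $f$ below (real roots $1$ and $-2$, real-root-free factor $x^2+1$) is again only an illustration:

```python
# Illustration: for each real root r of f∘f, the value f(r) is itself
# one of the real roots a_i of f, since (g_k∘f) has no real roots.
import sympy as sp

x = sp.symbols('x')
f = sp.expand((x - 1) * (x + 2) * (x**2 + 1))  # real roots 1, -2; g_k(x) = x^2 + 1

real_roots_f = sp.real_roots(f)                # [-2, 1]
real_roots_ff = sp.real_roots(f.subs(x, f))    # real roots of (f∘f)(x)

assert real_roots_ff                           # f∘f does have real roots here
for r in real_roots_ff:
    fr = f.subs(x, r).evalf(30)                # f(r), evaluated numerically
    assert min(abs(fr - sp.Float(a, 30)) for a in real_roots_f) < sp.Float('1e-20')
```

Every real root of $f\circ f$ is thus a real solution of $f(x)=a_i$ for some real root $a_i$ of $f$, matching the factorized form above.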
Thus, the assumption we made at $(\ast)$ cannot be true: if $f$ has real roots and $f\circ f$ has a real root, then there exists at least one real root $a$ of $f$ such that $f(x)-a=0$ has a real solution.
In conclusion, this only shows that if $f\in\Bbb R[x]$ has any real roots, then for $f\circ f$ to have a real root, it is necessary that $f(x)-a=0$ have a real solution for at least one real root $a$ of $f$.
This, however, does not mean that $f(x)-a=0$ has a real solution for every real root $a$ of $f$; that stronger claim is false, as illustrated by the counterexample in the answer by @roadrunner (in his example, $0$ and $2$ are the real roots of both $f$ and $f\circ f$, but only $f(x)=0$ has a real solution; the equation $f(x)=2$ does not). As such, $f(a)=0$ does not force $f(x)=a$ to have a real solution.
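@roadrunner's exact polynomial is not reproduced here, but one polynomial of the shape described (real roots $0$ and $2$, with $f(x)=2$ unsolvable over $\Bbb R$) is $f(x)=2x-x^2$, whose maximum value is $1$. It is used below purely as an assumed stand-in:

```python
# An assumed stand-in matching the description (not necessarily
# @roadrunner's exact polynomial): f has real roots 0 and 2, and
# f(x) <= 1 everywhere, so f(x) = 2 has no real solution.
import sympy as sp

x = sp.symbols('x')
f = 2*x - x**2                                 # f(x) = -(x - 1)^2 + 1

assert sp.real_roots(f) == [0, 2]              # real roots of f
assert sp.real_roots(f - 2) == []              # f(x) = 2: no real solution
assert sp.real_roots(f.subs(x, f)) == [0, 2]   # yet f∘f has real roots 0 and 2
```

So $f\circ f$ has real roots (coming entirely from solutions of $f(x)=0$), even though $f(x)=2$ has none, confirming that $f(a)=0$ alone does not force $f(x)=a$ to be solvable.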