Let $f$ be an analytic function such that $$f(z)=z+f(z^2)$$
Let $a_n=f^{(n)}(0)/n!$.
Find the radius of convergence of $\sum_{n=1}^{\infty}a_n z^{n}$.
$f'(z)=1+2zf'(z^2)$, so $f'(0)=1$ and $a_1=1$.
$f''(z)=2f'(z^2)+4z^2f''(z^2)$, so $f''(0)=2$ and $a_2=2/2!=1$.
But this process becomes lengthy for higher derivatives. Is there a cleverer and more interesting way to solve this problem?
Thanks
Write $f(z)=\sum_{n \geq 1} a_n z^n$. Since $f(z^2)=\sum_{n \geq 1} a_n z^{2n}$ contains only even powers, comparing coefficients in $f(z)=z+f(z^2)$ gives $a_1=1$, $a_{2n}=a_n$ for $n \geq 1$, and $a_n=0$ for odd $n \neq 1$. Iterating the recurrence, $a_n=1$ when $n$ is a power of $2$ and $a_n=0$ otherwise. The radius of convergence $R$ is given by $\frac{1}{R}=\limsup |a_n|^{1/n}$; along $n=2^k$ we have $|a_n|^{1/n}=1$, while all other terms vanish, so $\limsup |a_n|^{1/n}=1$ and hence $R=1$.
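As a quick sanity check, here is a small sketch (not part of the argument, just a numerical verification) that builds the coefficients from the recurrence $a_1=1$, $a_{2n}=a_n$, $a_n=0$ for odd $n \neq 1$, and confirms that the nonzero coefficients sit exactly at powers of $2$ with $|a_n|^{1/n}$ never exceeding $1$:

```python
# Build a_1..a_N from the recurrence derived by comparing coefficients
# in f(z) = z + f(z^2): a_1 = 1, a_{2m} = a_m, a_n = 0 for odd n > 1.
N = 64
a = [0] * (N + 1)
a[1] = 1
for n in range(2, N + 1):
    a[n] = a[n // 2] if n % 2 == 0 else 0

# The nonzero coefficients should occur exactly at powers of 2.
nonzero = [n for n in range(1, N + 1) if a[n] != 0]
print(nonzero)  # [1, 2, 4, 8, 16, 32, 64]

# Root test: |a_n|^{1/n} equals 1 along n = 2^k and 0 elsewhere,
# so limsup |a_n|^{1/n} = 1 and R = 1.
print(max(abs(a[n]) ** (1.0 / n) for n in range(1, N + 1)))  # 1.0
```

Since $1^{1/n}=1$ for every $n$, the root test values along the powers of $2$ are all exactly $1$, matching the claim that $R=1$.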