I am studying constant variance checking when conducting ANOVA.
I know that $\sqrt{Y}$ is one of the common transformations for Poisson data, but I can't prove it. I also read about the Anscombe transform, but it had quite a different formulation.
I think I have to use the delta method and a Taylor series, but I get stuck at the very first step.
In general, suppose a random variable $X$ has mean $\mu$ and a variance $\sigma^2(\mu)$ that is a function of $\mu.$ If $Y = f(X)$, then $Var(Y) \approx \sigma^2(\mu)[f^\prime(\mu)]^2.$ So choose $f$ so that $\sigma^2(\mu)[f^\prime(\mu)]^2$ is a constant.
In the Poisson case, $\sigma^2(\mu) = \mu$, and it follows that $f$ is the square root function.
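Spelled out, the delta method here is just a first-order Taylor expansion of $f$ about $\mu$:

$$f(X) \approx f(\mu) + (X - \mu)f^\prime(\mu) \quad\Longrightarrow\quad Var(f(X)) \approx [f^\prime(\mu)]^2\,Var(X) = \sigma^2(\mu)[f^\prime(\mu)]^2.$$

Requiring $\mu[f^\prime(\mu)]^2 = c^2$ gives $f^\prime(\mu) = c\mu^{-1/2}$, hence $f(\mu) = 2c\sqrt{\mu} + \text{const}$: any affine rescaling of $\sqrt{x}$ works. With $f(x) = \sqrt{x}$, the approximate variance is $\mu\left(\tfrac{1}{2\sqrt{\mu}}\right)^2 = \tfrac14$ for every $\mu$. Anscombe's $2\sqrt{x + 3/8}$ is this same transformation with a shift chosen to improve the approximation at small means, which is why it looked like a different formulation.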
In practice, taking square roots of Poisson data with different means does tend to make the variances more nearly equal without destroying the difference in the means. Here is a simulation:
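A minimal sketch of such a simulation (in Python with numpy; the sample size and group means are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # draws per group

# Poisson groups with very different means: on the raw scale the variance
# equals the mean, but after a square-root transform every group's
# variance settles near 1/4, as the delta method predicts.
for mu in [4, 16, 64]:
    x = rng.poisson(mu, size=n)
    y = np.sqrt(x)
    print(f"mu={mu:2d}: var(X)={x.var():6.2f}, var(sqrt(X))={y.var():.3f}")
```

The raw variances differ by a factor of 16, while the square-root-scale variances are all close to $1/4$; the transformed group means (roughly $\sqrt{\mu}$) remain clearly separated.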
That said, in statistical practice and in many simulation experiments, I have never seen a situation in which taking square roots revealed a significant difference that was not already evident before the transformation. And, with transformed data, you are left trying to interpret means of square roots, which are not the same as square roots of means (and similarly for differences).
By contrast, the variance-stabilizing transformation for exponential data is to take logs, and that is extremely useful in practice.
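To see the contrast concretely, here is the same kind of check for exponential data (again a sketch with numpy). Since $\sigma^2(\mu) = \mu^2$ for the exponential, the recipe above gives $f^\prime(\mu) \propto 1/\mu$, i.e. $f = \log$; in fact $Var(\log X) = \pi^2/6$ exactly for any exponential $X$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # draws per group

# Exponential groups: Var(X) = mu^2 grows with the mean, while log(X)
# has the same variance, pi^2/6 (about 1.645), for every mu.
for mu in [1.0, 10.0, 100.0]:
    x = rng.exponential(mu, size=n)
    z = np.log(x)
    print(f"mu={mu:6.1f}: var(X)={x.var():10.1f}, var(log(X))={z.var():.3f}")
```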