How to prove that $\sqrt{Y}$ is a variance-stabilizing transformation for the Poisson distribution?

I am studying constant variance checking when conducting ANOVA.

I know that $\sqrt{Y}$ is one of the common transformations for a Poisson distribution, but I can't prove it. I also read about the Anscombe transform, but it has a somewhat different form.

I think I have to use the delta method and a Taylor series, but I get stuck at the very first step.


In general, suppose a random variable $X$ has mean $\mu$ and a variance $\sigma^2(\mu)$ that is a function of $\mu.$ If $Y = f(X)$, then $Var(Y) \approx \sigma^2(\mu)[f^\prime(\mu)]^2.$ So choose $f$ so that $\sigma^2(\mu)[f^\prime(\mu)]^2$ is a constant.
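
The approximation follows from a first-order Taylor expansion of $f$ about the mean (this is the delta method mentioned in the question):

$$f(X) \approx f(\mu) + f'(\mu)(X - \mu) \quad\Longrightarrow\quad \mathrm{Var}\big(f(X)\big) \approx [f'(\mu)]^2\,\mathrm{Var}(X) = \sigma^2(\mu)\,[f'(\mu)]^2.$$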

In the Poisson case, $\sigma^2 = \mu$, and it follows that $f$ is the square root function.
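
Spelled out: set $\mu\,[f'(\mu)]^2 = c$ for a constant $c$ and solve the resulting differential equation:

$$f'(\mu) = \sqrt{c}\,\mu^{-1/2} \quad\Longrightarrow\quad f(\mu) = 2\sqrt{c}\,\sqrt{\mu} + \text{const}.$$

Taking $c = 1/4$ gives $f(x) = \sqrt{x}$, so $\mathrm{Var}(\sqrt{X}) \approx 1/4$ regardless of the Poisson mean.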

In practice, taking square roots of Poisson data with different means does tend to make the variances more nearly equal without destroying the difference in the means. Here is a simulation in R:

 x1 = rpois(1000, 1);  x2 = rpois(1000, 4)
 mean(x1); var(x1); mean(x2); var(x2)
 ## 0.984
 ## 0.9747187
 ## 3.937
 ## 3.902934  # far from 0.97 above
 y1 = sqrt(x1); y2 = sqrt(x2)
 mean(y1); var(y1); mean(y2); var(y2)
 ## 0.7678424
 ## 0.3948129
 ## 1.906227
 ## 0.3036023  # not so far from 0.39 above
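
The delta method predicts $\mathrm{Var}(\sqrt{X}) \approx 1/4$ for any Poisson mean, and the approximation improves as the mean grows. A quick check (a sketch assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(0)
# Poisson sample with a large mean, where the first-order approximation is most accurate
x = rng.poisson(lam=100, size=100_000)
v = np.sqrt(x).var()
print(v)  # close to the delta-method value 1/4
```

The variances 0.39 and 0.30 in the simulation above differ from 1/4 because the means 1 and 4 are small; the first-order approximation is rough there.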

That said, in statistical practice and in many simulation experiments, I have never seen a situation in which taking square roots reveals a significant difference that was not already evident before the transformation. And, with transformed data, you are left trying to interpret means of square roots, which are not the same as square roots of means (and similarly for differences).

By contrast, the variance-stabilizing transformation for exponential data is to take logs, and that is extremely useful in practice.
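
To see why, apply the same recipe: exponential data with mean $\mu$ have $\sigma^2(\mu) = \mu^2$, so $f'(\mu) \propto 1/\mu$ and $f$ is the logarithm. In fact, the variance of $\log X$ for exponential $X$ is exactly $\pi^2/6 \approx 1.645$, whatever the mean. A quick simulation (a sketch assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two exponential samples whose means differ by a factor of 10
x1 = rng.exponential(scale=1.0, size=100_000)
x2 = rng.exponential(scale=10.0, size=100_000)
print(x1.var(), x2.var())                  # raw variances differ by ~100x
print(np.log(x1).var(), np.log(x2).var())  # both near pi^2/6, about 1.645
```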