How do you interpret this parametrization of variance?


I am just learning about random variables. I have a random variable with variance

$$\frac{a \cdot b}{ (a+b)^2( a + b + 1)}$$

and mean $$ \frac{a}{a + b}$$

where $a, b > 0$, and I am looking at the statistic

$$S = \frac{1}{a + b + 1}$$

How do I interpret this statistic? It lies in the interval $(0,1)$, but if it is higher, does that mean I have more variance, or less? Or does it signify something else?
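As a quick numerical sketch (assuming these are the usual Beta$(a,b)$ mean and variance, which the formulas match): the given expressions satisfy the identity $\text{variance} = \text{mean}\cdot(1-\text{mean})\cdot S$, since $\text{mean}\cdot(1-\text{mean}) = \frac{ab}{(a+b)^2}$. This can be checked directly:

```python
from math import isclose

def beta_moments(a, b):
    """Mean and variance exactly as given in the question."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

def S(a, b):
    """The statistic in question."""
    return 1 / (a + b + 1)

# The formulas imply variance = mean * (1 - mean) * S,
# since mean * (1 - mean) = a*b / (a + b)^2.
for a, b in [(0.5, 0.5), (2, 3), (10, 1)]:
    mean, var = beta_moments(a, b)
    assert isclose(var, mean * (1 - mean) * S(a, b))
```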

1 Answer

This only answers the second of your 3 questions.

Taking $a$ as a fixed parameter and viewing the variance as a function of $b$, the variance increases on the range $0<b<L(a)$ and decreases afterwards. Take $a = 0.1$ for example; then $L \approx 0.086$. If $a \approx 1.7$, the variance increases for $0<b<1$, but the range (and rate of change) of the variance values is smaller. In general, as $a$ gets bigger, the range of $b$ over which increasing $b$ increases the variance gets larger, but the variance curve with respect to $b$ becomes flatter and closer to zero.
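A quick numerical check of this threshold (a sketch; the function name `L_of_a` and the bisection tolerance are my own choices): differentiating $\ln V$ for $V(b)=\frac{ab}{(a+b)^2(a+b+1)}$ shows the variance peaks where $\frac{1}{b}-\frac{2}{a+b}-\frac{1}{a+b+1}=0$, and bisection on that condition recovers the values quoted above:

```python
def L_of_a(a, lo=1e-9, hi=100.0, tol=1e-10):
    """Find the b at which V(b) = a*b / ((a+b)^2 (a+b+1)) peaks,
    by bisecting the zero of d(ln V)/db = 1/b - 2/(a+b) - 1/(a+b+1)."""
    def dlogV(b):
        return 1.0 / b - 2.0 / (a + b) - 1.0 / (a + b + 1)
    # dlogV is positive for small b (variance increasing) and negative
    # for large b (variance decreasing), so the sign change brackets L(a).
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if dlogV(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(L_of_a(0.1))  # approximately 0.086, as quoted in the answer
print(L_of_a(1.7))  # approximately 1
```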