Relation between $g(\mathbb{E}[X])$ and $\mathbb{E}[g(X)]$ when $g(X)=\frac{X}{1-X}$ is the "odds" function


Some context: Let's say we draw from an urn containing red and blue balls. We start at $n=1$, draw one ball, look at its color, and put it back; we repeat the process at $n=2$, $n=3$, and so on, and every time we encounter a red ball we increment a counter $r$. Thus, the counter $b$ of blue balls is simply $b=n-r$. With every observation, we have a new estimate $X(n,r)$ of some other probability. In other words, $X$ is a function of $n$ and $r$ which returns a real number in $[0,1]$. As such, $X$ expresses a probability, but it can equally well be expressed as the odds $\frac{X}{1-X}$.

Question: Although the two representations of this probability are equivalent, their expectations over $n$ draws differ, because $\mathbb{E}[g(X)]\neq g(\mathbb{E}[X])$ in general. I believe Jensen's inequality is at play here and yields $\mathbb{E}\left[\frac{X}{1-X}\right]>\frac{\mathbb{E}[X]}{1-\mathbb{E}[X]}$, because the odds transformation is convex. I am wondering if there is any way to use a "convexity correction" of some sort to go from one expectation to the other?

Expression: Going back to the urn example, suppose the probability of drawing a red ball is $p$. Then the probability mass function of $r$ is binomial, and we can express both expectations as (I think): $$ \mathbb{E}\left[X(n,r)\right]=\sum\limits_{r=0}^n\binom{n}{r}p^r(1-p)^{n-r}X(n,r)\\ \mathbb{E}\left[\frac{X(n,r)}{1-X(n,r)}\right]=\sum\limits_{r=0}^n\binom{n}{r}p^r(1-p)^{n-r}\frac{X(n,r)}{1-X(n,r)} $$ Is there any way to go from, say, the expectation of the odds $\mathbb{E}\left[\frac{X}{1-X}\right]$ to the expectation of the probability $\mathbb{E}[X]$?
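As a concrete sanity check, the two sums above can be evaluated numerically. The sketch below assumes a hypothetical choice of estimator, $X(n,r)=(r+1)/(n+2)$ (a smoothed estimate that stays strictly inside $(0,1)$, so the odds are always finite); any other $X$ with that property would do:

```python
from math import comb

def expectations(n, p, X):
    """E[X] and E[X/(1-X)] when r ~ Binomial(n, p)."""
    e_x = e_odds = 0.0
    for r in range(n + 1):
        w = comb(n, r) * p**r * (1 - p)**(n - r)  # binomial pmf at r
        x = X(n, r)
        e_x += w * x
        e_odds += w * x / (1 - x)
    return e_x, e_odds

# Hypothetical estimator: the smoothed estimate (r+1)/(n+2),
# which stays strictly inside (0, 1) so the odds never divide by zero.
X = lambda n, r: (r + 1) / (n + 2)
e_x, e_odds = expectations(10, 0.3, X)

# Jensen's inequality: x/(1-x) is convex on [0, 1),
# so E[X/(1-X)] strictly exceeds E[X]/(1 - E[X]).
assert e_odds > e_x / (1 - e_x)
```

With this particular $X$, the gap $\mathbb{E}\left[\frac{X}{1-X}\right]-\frac{\mathbb{E}[X]}{1-\mathbb{E}[X]}$ is exactly the "convexity correction" the question asks about.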

The only thing I have found so far are pages 2 and 3 of this appendix, but it basically states the same problem I am encountering without proposing a way to solve it.

BEST ANSWER

I'll assume you're trying to estimate the fraction $p$ of red balls in the urn. Thus after $n$ drawings with $r$ red balls your estimator is $r/n$. I'll write that as $X = R/n$ where $R$ is a random variable with binomial distribution having parameters $n$ and $p$. We have $\mathbb E[X] = \mathbb E[R]/n = p$. On the other hand, $\mathbb E[X/(1-X)]$ does not exist because there is a nonzero probability of $X=1$ (i.e. all balls drawn were red), leading to a division by $0$.

EDIT: If $Y = X/(1-X)$ then $X = Y/(1+Y)$. Assuming $|Y| < 1$ with probability $1$ (i.e. $X < 1/2$), we can expand this in a convergent series $X = Y - Y^2 + Y^3 - Y^4 + \ldots$, so $$\mathbb E[X] = \mathbb E[Y] - \mathbb E[Y^2] + \mathbb E[Y^3] - \mathbb E[Y^4] + \ldots$$ But in any case you need more information than just $\mathbb E[Y]$.
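This expansion can be verified numerically. The sketch below uses a hypothetical estimator kept strictly below $1/2$, so that $|Y|=X/(1-X)<1$ holds with probability $1$ and the alternating series converges:

```python
from math import comb

n, p = 10, 0.3
# Hypothetical estimator kept strictly below 1/2 so that |Y| = X/(1-X) < 1
X = lambda r: (r + 1) / (2 * (n + 2))

weights = [comb(n, r) * p**r * (1 - p)**(n - r) for r in range(n + 1)]
e_x = sum(w * X(r) for w, r in zip(weights, range(n + 1)))

def e_y_pow(k):
    """E[Y^k] where Y = X/(1-X)."""
    return sum(w * (X(r) / (1 - X(r)))**k for w, r in zip(weights, range(n + 1)))

# E[X] = E[Y] - E[Y^2] + E[Y^3] - E[Y^4] + ...
series = sum((-1)**(k + 1) * e_y_pow(k) for k in range(1, 200))
assert abs(series - e_x) < 1e-12
```

Note that the partial sums need every moment $\mathbb E[Y^k]$, which illustrates the answer's point: $\mathbb E[Y]$ alone is not enough.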

ANSWER

$E\left[{X \over 1-X}\right]$ bakes in higher-order moment information beyond just $E[X]$. In particular, assuming $|X|<1$ with probability $1$, then

$$E\left[{X \over 1-X}\right]=E\left[\sum_{i=1}^\infty X^i\right]=\sum_{i=1}^\infty E\left[X^i\right].$$
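A quick numeric check of this geometric-series identity, again with a hypothetical estimator that stays strictly inside $(0,1)$:

```python
from math import comb

n, p = 8, 0.25
# Hypothetical estimator strictly inside (0, 1) so the series converges
X = lambda r: (r + 1) / (n + 2)

weights = [comb(n, r) * p**r * (1 - p)**(n - r) for r in range(n + 1)]
e_odds = sum(w * X(r) / (1 - X(r)) for w, r in zip(weights, range(n + 1)))

# E[X/(1-X)] = sum over i >= 1 of E[X^i], truncated once the tail is negligible
moment_series = sum(
    sum(w * X(r)**i for w, r in zip(weights, range(n + 1)))
    for i in range(1, 400)
)
assert abs(moment_series - e_odds) < 1e-12
```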