Is $E\left[\frac{1}{\sum_{i=1}^{n}X_i}\right]$ = $\frac{1}{\sum_{i=1}^{n} E\left[X_i \right]}$?


If $X_i$'s are i.i.d. random variables then is this statement true?

$$E\left[\frac{1}{\sum_{i=1}^{n}X_i}\right] = \frac{1}{\sum_{i=1}^{n} E\left[X_i \right]}$$

Here $E\left[X\right]$ is the expected value of a random variable $X$.

Edit - I was thinking that if each $X_i$ corresponds to the result of an independent random experiment, will the given equation be true or false? I intuitively feel that if we perform these $n$ experiments a large number of times, then the denominator will be very close to $\sum_{i=1}^{n}E[X_i]$ most of the time.


Accepted answer:

Your statement is false even for $n=1$. Take, for instance, on $\{1,\dotsc, k\}$ the variable $X(j) = j$ for $j = 1,\dotsc, k$, with uniform probability measure $\mathbb{P}(\{j\}) = 1/k$. Then $$ E\left [ \frac{1}{X} \right ] = \sum_{j=1}^k \frac{1}{k} \frac{1}{j} = \frac{1}{k} \sum_{j=1}^k \frac{1}{j} $$ while $$ \frac{1}{E[X]} = \frac{1}{\frac{1}{k}\sum_{j=1}^k j} = \frac{2k}{k(k+1)} = \frac{2}{k+1} $$ which are different.
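This counterexample can be checked with exact arithmetic (a minimal sketch using Python's `fractions`; the function name `compare` is just for illustration):

```python
from fractions import Fraction

def compare(k):
    # X uniform on {1, ..., k} with P(X = j) = 1/k.
    # E[1/X] = (1/k) * sum_{j=1}^{k} 1/j
    e_of_inverse = sum(Fraction(1, j) for j in range(1, k + 1)) / k
    # 1/E[X] = 1 / ((k+1)/2) = 2/(k+1)
    inverse_of_e = Fraction(2, k + 1)
    return e_of_inverse, inverse_of_e

e_of_inverse, inverse_of_e = compare(3)
# e_of_inverse = 11/18 while inverse_of_e = 1/2 -- not equal
```

For every $k \ge 2$ the first quantity is strictly larger, consistent with Jensen's inequality for the convex function $x \mapsto 1/x$.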

However, if the question is $$ \lim_{n \to \infty} E \left [ \frac{1}{\sum_{i=1}^n X_i} \right ] = \lim_{n \to \infty} \frac{1}{E \left [ \sum_{i=1}^n X_i \right ]} $$ I do not know.

Answer:

Take two fair-coin-like variables: $\mathbb{P}[X_i=2]=\mathbb{P}[X_i=1]=1/2$ for $i=1,2$. Then $$\mathbb{E}\left[\frac{1}{X_1+X_2}\right]=\frac{1}{4}\times \frac{1}{4}+\frac{1}{4}\times\frac{1}{2}+\frac{2}{4}\times \frac{1}{3}=\frac{17}{48}\ ,$$ while $$ \mathbb{E}[X_1+X_2]=\frac{1}{4}\times 4+\frac{1}{4}\times 2+\frac{2}{4}\times 3=3\ , $$ so $\mathbb{E}\left[1/(X_1+X_2)\right] = 17/48 \neq 1/3 = 1/\mathbb{E}[X_1+X_2]$.
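The arithmetic can be verified by enumerating the four equally likely outcome pairs (a quick check in exact arithmetic; it assumes nothing beyond the distribution stated above):

```python
from fractions import Fraction
from itertools import product

# X1, X2 independent, each equal to 1 or 2 with probability 1/2,
# so each of the four outcome pairs has probability 1/4.
p = Fraction(1, 4)
pairs = list(product([1, 2], repeat=2))

e_inv_sum = sum(p * Fraction(1, x1 + x2) for x1, x2 in pairs)  # E[1/(X1+X2)]
e_sum = sum(p * (x1 + x2) for x1, x2 in pairs)                 # E[X1+X2]
# e_inv_sum = 17/48, while 1/e_sum = 1/3 = 16/48
```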

Answer:

If it were true, then for the special case $n=1$ it would also hold that $$\mathbb E\left(\frac1{X}\right)=\frac1{\mathbb E[X]}.$$

There are plenty of counterexamples to this; indeed, since $x \mapsto 1/x$ is strictly convex on $(0,\infty)$, Jensen's inequality gives $\mathbb E\left(1/X\right) > 1/\mathbb E[X]$ for any non-degenerate positive $X$.

Answer:

Your statement is false, as pointed out by other answers. But your intuition is right, in the sense that asymptotically (under some conditions) the equation is true. In general, for any "well behaved" function $Y=g(Z)$ we can do a Taylor expansion around the mean ($E[Z]=\mu_Z$) and take expectations; we get:

$$Y \approx g(\mu_Z) + g'(\mu_Z) (Z-\mu_Z) + \frac{1}{2}g''(\mu_Z) (Z-\mu_Z)^2 +\cdots \tag{1}$$

$$\mu_Y \approx g(\mu_Z) + \frac{1}{2!}g''(\mu_Z) \; m_{2,Z} + \frac{1}{3!} g'''(\mu_Z) \; m_{3,Z} + \cdots \tag{2}$$

where $m_{k,Z}$ is the $k$-th central moment of $Z$.

In your case take $g(Z)=1/Z$ and $Z=(X_1+X_2 +\cdots+X_n)/n$. Provided the $X_i$ have finite moments, the central moments of $Z$ shrink as $n$ grows (for example, $m_{2,Z}=\operatorname{Var}(X_i)/n$), so only the first term in $(2)$ survives for large $n$, and

$$E\left[\frac{n}{X_1+X_2 +\cdots+X_n}\right] \approx \frac{1}{E\left[\frac{X_1+X_2 +\cdots+X_n}{n}\right]}=\frac{1}{E[X_i]}$$

or, equivalently,

$$E\left[\frac{1}{X_1+X_2 +\cdots+X_n}\right] \approx \frac{1}{n E[X_i]}$$
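One way to see this asymptotic behaviour numerically is a small Monte Carlo sketch. The choice of distribution here is mine, not the answer's: take $X_i$ exponential with rate $1$, so $E[X_i]=1$; for this choice $\sum_i X_i$ is Gamma$(n,1)$ and $E[1/\sum_i X_i]$ equals $1/(n-1)$ exactly, which approaches $1/n = 1/(nE[X_i])$ as $n$ grows.

```python
import random

def mc_mean_inverse_sum(n, trials=50_000, seed=0):
    # Estimate E[1 / (X_1 + ... + X_n)] for i.i.d. Exponential(1) X_i
    # by averaging 1/S over `trials` independent samples of S.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(rng.expovariate(1.0) for _ in range(n))
        total += 1.0 / s
    return total / trials

est = mc_mean_inverse_sum(40)
# est lands near the exact value 1/(n-1) = 1/39, and hence near 1/n = 1/40
```

The gap between $1/(n-1)$ and $1/n$ is of order $1/n^2$, which matches the size of the correction terms dropped from $(2)$.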