When non-negative random variables $X_i$ are i.i.d. we have $$\mathbb{E}\frac{X_i}{X_1+\dots+X_n} = \frac{1}{n}.$$ What can be said in the non-identical case? Specifically, if $X_i\geq 0$ are independent (but not identically distributed), can we say that $$\mathbb{E}\frac{X_i}{X_1+\dots+X_n}$$ is close to $$\frac{\mathbb{E}X_i}{\mathbb{E}X_1 + \dots + \mathbb{E}X_n}$$ in, say, absolute value (where the closeness might depend on the variances of the $X_i$)? Note that if $X_i\sim\text{Gamma}(\alpha_i,1)$, this holds with equality.
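As a sanity check of the Gamma claim, here is a small Monte Carlo simulation (illustrative parameters of my choosing, not from the question): for independent $X_i\sim\text{Gamma}(\alpha_i,1)$ the identity $\mathbb{E}[X_i/(X_1+\dots+X_n)] = \alpha_i/\sum_j \alpha_j$ holds, and since $\mathbb{E}X_i = \alpha_i$ this is exactly $\mathbb{E}X_i/\sum_j \mathbb{E}X_j$.

```python
# Monte Carlo check of the Gamma identity
#   E[X_i / (X_1 + ... + X_n)] = alpha_i / (alpha_1 + ... + alpha_n)
# for independent X_i ~ Gamma(alpha_i, 1). Illustrative alphas, fixed seed.
import numpy as np

rng = np.random.default_rng(0)
alphas = np.array([0.5, 1.0, 2.5])   # shape parameters (assumed for illustration)
N = 200_000                          # number of Monte Carlo samples

X = rng.gamma(shape=alphas, size=(N, len(alphas)))  # each row: (X_1, ..., X_n)
ratios = X / X.sum(axis=1, keepdims=True)           # X_i / (X_1 + ... + X_n)

empirical = ratios.mean(axis=0)
exact = alphas / alphas.sum()
print(empirical, exact)  # should agree to a couple of decimal places
assert np.allclose(empirical, exact, atol=0.01)
```

(The ratios are bounded by $1$, so with $2\times 10^5$ samples the standard error is well below the $0.01$ tolerance.)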
Expected Proportion of Random Variables

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-28. 116 views.

There are 2 best solutions below.
For $\epsilon,K>0$ let $$ X_1 = \begin{cases} 2 & \text{with prob. } \tfrac{1}{2},\\ 2\epsilon & \text{with prob. } \tfrac{1}{2}, \end{cases}\qquad X_2 = \begin{cases} K & \text{with prob. } \tfrac{1+\epsilon}{K},\\ 0 & \text{otherwise.} \end{cases} $$ Then $\mathbb{E} X_1 = \mathbb{E} X_2 = 1+\epsilon$ but, since $X_1/(X_1+X_2) = 1$ whenever $X_2 = 0$, $$ \biggl|\mathbb{E}\frac{X_1}{X_1 + X_2} - \frac{\mathbb{E}X_1}{\mathbb{E}X_1 + \mathbb{E}X_2}\biggr| \geq \bigl|\tfrac{1}{2} - \tfrac{1+\epsilon}{K}\bigr| \to \frac{1}{2} $$ as $K\to\infty$. So no bound in terms of the means alone is possible. (Note that $\operatorname{Var}X_2 = K(1+\epsilon) - (1+\epsilon)^2 \to \infty$, so this does not rule out a variance-dependent bound.)
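This counterexample can be checked exactly by enumerating the four joint outcomes (the specific value $\epsilon = 1/10$ below is my choice for illustration):

```python
# Exact check of the counterexample: E[X1/(X1+X2)] drifts away from
# E X1 / (E X1 + E X2) = 1/2 as K grows, even though E X1 = E X2 = 1 + eps.
from fractions import Fraction

def gap(eps, K):
    """|E[X1/(X1+X2)] - E X1/(E X1 + E X2)|, by enumerating outcomes."""
    X1 = [(Fraction(2), Fraction(1, 2)), (2 * eps, Fraction(1, 2))]
    p2 = (1 + eps) / K
    X2 = [(Fraction(K), p2), (Fraction(0), 1 - p2)]
    e_ratio = sum(p * q * x / (x + y) for x, p in X1 for y, q in X2)
    ex1 = sum(p * x for x, p in X1)    # = 1 + eps
    ex2 = sum(q * y for y, q in X2)    # = 1 + eps
    return abs(e_ratio - ex1 / (ex1 + ex2))

eps = Fraction(1, 10)
gaps = [gap(eps, K) for K in (10, 100, 10_000)]
print([float(g) for g in gaps])  # grows toward 1/2 as K increases
```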
$\newcommand\E{\mathbb{E}}$Denote $\sigma_i = \E|X_i - \E X_i|$ and $X_{-k} = X_1 + \dots + X_{k-1} + X_{k+1} + \dots + X_n$, and assume $X_i>0$ almost surely. We can prove the following Efron–Stein-looking inequality, with a nicer bound when we are guaranteed $X_i\geq m>0$ almost surely: $$ \biggl|\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n}\biggr| \leq \sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}} \leq \frac{1}{(n-1)m}\sum_{j=1}^n\sigma_j. $$ Writing $p, q$ for the vectors on the simplex with coordinates $p_i = \E[X_i/(X_1+\dots+X_n)]$ and $q_i = \E X_i/(\E X_1 + \dots + \E X_n)$, the proof also gives a somewhat tighter $\ell^1$ bound: $$ \|p - q\|_1 \leq 2\sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}} \leq \frac{2}{(n-1)m}\sum_{j=1}^n\sigma_j. $$ Hopefully someone can restate this bound in terms of $\E X_j$ instead of expected reciprocals, so I won't accept this answer for a while.
Proof. Compute $$ \begin{align*} \E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n} &= \E\biggl[\frac{X_i(\E X_1+\dots + \E X_n) - (X_1+\dots+X_n)\E X_i}{(X_1+\dots+X_n)(\E X_1 + \dots + \E X_n)}\biggr]\\ &= \frac{1}{\E X_1 + \dots + \E X_n}\sum_{j\neq i} \E\biggl[\frac{X_i\E X_j - X_j \E X_i} {X_1 + \dots + X_n}\biggr], \end{align*} $$ where the $j=i$ term vanishes from the sum. Using $|\E Y|\leq\E|Y|$ and the triangle inequality (add and subtract $\E X_i\,\E X_j$ in the numerator), $$ \begin{align*} \biggl|\E\frac{X_i\E X_j - X_j \E X_i}{X_1 + \dots + X_n}\biggr| &\leq \E\frac{|X_i\E X_j - X_j \E X_i|}{X_1 + \dots + X_n}\\ &\leq \E\frac{|X_i - \E X_i|\E X_j + |X_j - \E X_j|\E X_i}{X_1 + \dots + X_n}. \end{align*} $$ By non-negativity we may drop $X_i$ from the denominator, and since $|X_i - \E X_i|$ is independent of $X_{-i}$ the expectation factors: $$ \E\frac{|X_i - \E X_i|\E X_j}{X_1 + \dots + X_n} \leq \E\frac{|X_i - \E X_i|\E X_j}{X_{-i}} = \E\frac{\sigma_i \E X_j}{X_{-i}}. $$ Hence, summing over $j\neq i$ and using $\sum_{j\neq i}\E X_j \leq \sum_j \E X_j$, $$ \begin{align*} \biggl|\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n}\biggr| \leq \E\frac{\sigma_i}{X_{-i}} + \frac{\E X_i}{\E X_1 + \dots + \E X_n}\sum_{j\neq i} \E\frac{\sigma_j}{X_{-j}}\leq \sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}}. \qquad\blacksquare \end{align*} $$
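The whole chain of inequalities can be verified exactly on small finite examples by enumerating the product distribution. The three two-point distributions below are my own illustrative choices (each bounded below by $m=1$), not anything from the answer:

```python
# Exact check of the bound chain
#   |E[X_i/S] - E X_i / E S|  <=  sum_j E[sigma_j / X_{-j}]  <=  sum_j sigma_j / ((n-1) m),
# where S = X_1 + ... + X_n, X_{-j} = S - X_j, and sigma_j = E|X_j - E X_j|.
from fractions import Fraction
from itertools import product

# Three independent two-point variables, each >= m = 1 almost surely.
dists = [
    [(Fraction(1), Fraction(1, 2)), (Fraction(3), Fraction(1, 2))],
    [(Fraction(2), Fraction(1, 4)), (Fraction(5), Fraction(3, 4))],
    [(Fraction(1), Fraction(2, 3)), (Fraction(4), Fraction(1, 3))],
]
n, m = len(dists), Fraction(1)

def E(f):
    """E[f(X_1,...,X_n)] by enumerating the finite product distribution."""
    total = Fraction(0)
    for omega in product(*dists):
        xs = [x for x, _ in omega]
        p = Fraction(1)
        for _, q in omega:
            p *= q
        total += p * f(xs)
    return total

means = [E(lambda xs, i=i: xs[i]) for i in range(n)]
sigmas = [E(lambda xs, i=i: abs(xs[i] - means[i])) for i in range(n)]

mid = sum(E(lambda xs, j=j: sigmas[j] / (sum(xs) - xs[j])) for j in range(n))
right = sum(sigmas) / ((n - 1) * m)

for i in range(n):
    lhs = abs(E(lambda xs, i=i: xs[i] / sum(xs)) - means[i] / sum(means))
    assert lhs <= mid <= right
print(float(mid), float(right))
```

Using `Fraction` keeps every expectation exact, so the assertions test the inequalities themselves rather than floating-point approximations.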