Let $X_1,\ldots,X_n$ be independent Bernoulli random variables with $n$ odd, $P(X_i=1)=p_i$ and $p_i \in [1/2,1]$ for all $i=1,\ldots,n$. I want to prove that \begin{equation} P\Bigl(\sum_{i=1}^n X_i\geq\frac{n+1}{2}\Bigr) \geq \frac{\sum_{i=1}^np_i}{n}. \end{equation} In words: the probability that the majority of the $X_i$ are $1$ is at least the average of the probabilities. I strongly believe this is true, but I am not sure how a probabilistic proof would go; it seems that neither a reverse Markov inequality nor Jensen's inequality can give this result. Any help would be greatly appreciated!
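Before attempting a proof, the claim can be sanity-checked numerically by exact enumeration over all $2^n$ outcomes for small odd $n$ (this is only a check, not a proof; the helper name `majority_prob` is mine):

```python
# Exact sanity check of the conjectured inequality
# P(sum X_i >= (n+1)/2) >= (sum p_i)/n  for p_i in [1/2, 1], n odd.
import itertools
import random

def majority_prob(p):
    """P(sum of independent Bernoulli(p_i) >= (n+1)/2), by enumeration."""
    n = len(p)
    total = 0.0
    for bits in itertools.product([0, 1], repeat=n):
        if sum(bits) >= (n + 1) / 2:
            prob = 1.0
            for b, pi in zip(bits, p):
                prob *= pi if b == 1 else 1 - pi
            total += prob
    return total

random.seed(0)
for trial in range(200):
    n = random.choice([1, 3, 5, 7])
    p = [random.uniform(0.5, 1.0) for _ in range(n)]
    assert majority_prob(p) >= sum(p) / n - 1e-12
print("no counterexample found")
```

The check passes on random instances, which at least rules out an easy counterexample.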
Lower bound for the distribution of the sum of independent Bernoulli variables (not i.i.d.)
284 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There is 1 best solution below.
The suggested inequality does indeed hold, but the only proof I found takes some work.
Let ${\bf p}:=(p_1,\ldots,p_n)$ and $S:=\sum_{i =1}^n X_i$. Define \begin{equation} f({\bf p}):={\mathbb P}_{\bf p}\Bigl(S \geq\frac{n+1}{2}\Bigr) - \frac{\sum_{i=1}^np_i}{n}\,, \end{equation} so the goal is to show that $f$ is nonnegative on the cube $Q=[1/2,1]^n$.
Since $f$ is continuous on the compact set $Q$, it attains a minimum there. Denote $S_{-k}=S-X_k$. Observe that $f$ is a polynomial in the variables $p_k$, and the first term in its definition is the probability of the increasing event $\{S \ge (n+1)/2\} \,.$
Thus by Russo's formula [1], [2] for the derivative of the probability of an increasing event, or by direct differentiation, $$\frac{\partial f}{\partial p_k}({\bf p})={\mathbb P}_{\bf p}\Bigl(S_{-k}=\frac{n-1}{2}\Bigr) - \frac{1}{n} $$ $$={\mathbb P}_{\bf p}\Bigl(S_{-k} \ge \frac{n-1}{2}\Bigr) - {\mathbb P}_{\bf p}\Bigl(S_{-k} \ge \frac{n+1}{2}\Bigr)-\frac{1}{n}\,. $$ A second application of Russo's formula yields $$\frac{\partial^2 f}{\partial p_k^2}({\bf p}) ={\mathbb P}_{\bf p}\Bigl(S_{-k} =\frac{n-3}{2}\Bigr) - {\mathbb P}_{\bf p}\Bigl(S_{-k} = \frac{n-1}{2}\Bigr) \,. \tag{*} $$ By a theorem of Darroch [3], reproved in [4] (see also [5] for the most convenient reference), the distribution of the Poisson-Binomial variable $S_{-k}$ is unimodal, with one or two adjacent modes (peaks). Moreover, its lower mode differs by at most $1$ from its mean $${\mathbb E}(S_{-k})=\sum_{i \ne k} p_i \ge \frac{n-1}{2} \,.$$
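Darroch's mode property quoted above is easy to probe numerically. A small sketch (the helper name is mine) computes the Poisson-Binomial pmf by iterated convolution and checks unimodality and the mode-mean distance for random parameters, even allowing $p_i$ outside $[1/2,1]$ since the theorem needs no such restriction:

```python
# Numerical check of Darroch's theorem for a Poisson-Binomial sum:
# the pmf is unimodal and its (lower) mode is within 1 of the mean.
import random

def poisson_binomial_pmf(p):
    """pmf of the sum of independent Bernoulli(p_i), by iterated convolution."""
    pmf = [1.0]
    for pi in p:
        new = [0.0] * (len(pmf) + 1)
        for j, q in enumerate(pmf):
            new[j] += q * (1 - pi)      # X_i = 0
            new[j + 1] += q * pi        # X_i = 1
        pmf = new
    return pmf

random.seed(1)
for trial in range(500):
    n = random.randint(1, 12)
    p = [random.uniform(0.0, 1.0) for _ in range(n)]
    pmf = poisson_binomial_pmf(p)
    mode = min(range(len(pmf)), key=lambda j: (-pmf[j], j))  # lower mode
    mean = sum(p)
    # unimodal: nondecreasing up to the mode, nonincreasing after it
    assert all(pmf[j] <= pmf[j + 1] + 1e-12 for j in range(mode))
    assert all(pmf[j] >= pmf[j + 1] - 1e-12 for j in range(mode, n))
    assert abs(mode - mean) <= 1 + 1e-9
print("unimodality and |mode - mean| <= 1 confirmed on all samples")
```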
The lower mode of $S_{-k}$ is therefore at least $\frac{n-1}{2}$: if ${\mathbb E}(S_{-k})>\frac{n-1}{2}$, this holds because the mode is an integer exceeding ${\mathbb E}(S_{-k})-1>\frac{n-3}{2}$; if ${\mathbb E}(S_{-k})=\frac{n-1}{2}$, then $p_i=1/2$ for all $i \ne k$, so $S_{-k}\sim\text{Bin}(n-1,1/2)$ has mode exactly $\frac{n-1}{2}$. By unimodality, the probability mass function of $S_{-k}$ is nondecreasing up to $\frac{n-1}{2}$, so the right-hand side of $(*)$ is non-positive, and it follows from $(*)$ that $f$ is concave in each variable $p_k$ separately.
(Remark: In [6] it is shown that the distribution of $S_{-k}$ is strictly log concave, which implies that $f$ is strictly concave in each variable, but we do not need that.)
Thus $f$ attains its minimum at some extreme point ${\bf p^*}$ of $Q$, where $p_k^* \in \{1/2,1\}$ for all $k$. Given such a ${\bf p^*}$ with $p^*_k =1/2$ for exactly $\ell$ values of $k$, we find that $$f({\bf p^*})=\mathbb P\Bigl(\text{Bin}(\ell,1/2) \ge \frac{n+1}2-(n-\ell)\Bigr)-\frac{(n-\ell)+\ell/2}{n} = \frac{\ell}{2n}-\mathbb P\Bigl(\text{Bin}(\ell,1/2) \le \ell-\frac{n+1}2\Bigr)\,.$$ If $\ell < \frac{n+1}{2}$, the last probability vanishes and $f({\bf p^*}) \ge 0$. Otherwise, set $j:=\ell-\frac{n+1}{2}$ and note $j \le \frac{\ell-1}{2}$ since $\ell \le n$. By unimodality and symmetry of the binomial coefficients, the $2(j+1)$ coefficients $\binom{\ell}{i}$ with $i\le j$ or $i \ge \ell-j$ are the $2(j+1)$ smallest, so their average is at most the overall average $2^\ell/(\ell+1)$; since $\sum_{i\le j}\binom{\ell}{i}$ is half their total, $$\mathbb P\Bigl(\text{Bin}(\ell,1/2) \le j\Bigr) \le \frac{j+1}{\ell+1}=\frac{\ell-\frac{n-1}{2}}{\ell+1} \le \frac{\ell}{2n}\,,$$ where the last inequality is equivalent to $(n-\ell)(n-\ell-1)\ge 0$. Hence $f({\bf p^*})\ge 0$, completing the proof.
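As a sanity check on this extreme-point computation, one can verify the nonnegativity of $f({\bf p^*})=\frac{\ell}{2n}-\mathbb P(\text{Bin}(\ell,1/2)\le \ell-\frac{n+1}{2})$ for all odd $n$ in a range and all $0\le\ell\le n$ (the helper name `binom_cdf_half` is mine):

```python
# Check that at every extreme point (l coordinates equal to 1/2,
# n - l coordinates equal to 1):
#   f(p*) = l/(2n) - P(Bin(l, 1/2) <= l - (n+1)/2) >= 0.
from math import comb

def binom_cdf_half(l, j):
    """P(Bin(l, 1/2) <= j), exact via binomial coefficients."""
    if j < 0:
        return 0.0
    return sum(comb(l, i) for i in range(0, min(j, l) + 1)) / 2 ** l

for n in range(1, 60, 2):          # odd n
    for l in range(0, n + 1):      # number of coordinates equal to 1/2
        f_star = l / (2 * n) - binom_cdf_half(l, l - (n + 1) // 2)
        assert f_star >= -1e-12, (n, l, f_star)
print("f(p*) >= 0 at all extreme points checked")
```

Note that equality holds at $\ell=0$ (all $p_i=1$) and $\ell=n$ (all $p_i=1/2$), so the bound in the theorem is sharp at both corners.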
[1] https://arxiv.org/pdf/1102.5761.pdf, p. 29.
[2] L. Russo, An approximate zero-one law, Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete 61 (1982), 129–139.
[3] J. N. Darroch, On the distribution of the number of successes in independent trials, Ann. Math. Statist. 35 (1964).
[4] S. M. Samuels, On the number of successes in independent trials, Ann. Math. Statist. 36 (1965), 1272–1278.
[5] https://arxiv.org/pdf/1908.10024.pdf, eq. (2.2).
[6] http://www3.stat.sinica.edu.tw/statistica/oldpdf/A3n23.pdf, eq. (20).