Bernoulli trials with sum of probabilities of success one


The following problem looks familiar and is probably a known result. Consider $N$ independent Bernoulli experiments with probabilities of success $p_i$ such that $\sum_{i=1}^N p_i = 1$.

What are the values of $p_i$ that minimize the probability of at least one success? What is the minimum probability as $N$ goes to infinity?

My guess is $p_i = 1/N$, which gives a probability of at least one success of $1 - (1 - 1/N)^N$, with limit $1 - e^{-1}$.
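As a quick numeric sanity check of this guess (a sketch, not part of the original question), one can evaluate $1 - (1 - 1/N)^N$ for growing $N$ and compare it with the conjectured limit $1 - e^{-1}$:

```python
import math

# Evaluate P(at least one success) = 1 - (1 - 1/N)^N for growing N
# and compare with the conjectured limit 1 - e^{-1} ≈ 0.632121.
for N in (1, 10, 100, 10_000):
    p_os = 1 - (1 - 1 / N) ** N
    print(N, round(p_os, 6))

print("limit:", round(1 - math.exp(-1), 6))
```

The printed values decrease toward $1 - e^{-1}$, consistent with the guess.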




I presume the experiments are independent.

The probability of at least one success is $1 - (1-p_1)(1-p_2)\cdots(1-p_N)$. If you minimize this with the constraint $\sum_{i=1}^N p_i = 1$ (e.g. using Lagrange multipliers), you will get your guess $p_1 = \cdots = p_N = 1/N$. The rest of your work is then correct.
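A brute-force check supports this conclusion (an illustrative sketch; `p_at_least_one` is a helper name introduced here): sample random probability vectors on the simplex $\sum_i p_i = 1$ and verify that none yields a smaller probability of at least one success than the uniform choice $p_i = 1/N$:

```python
import random

def p_at_least_one(ps):
    """P(at least one success) = 1 - prod_i (1 - p_i)."""
    prob_none = 1.0
    for p in ps:
        prob_none *= 1 - p
    return 1 - prob_none

N = 5
best = p_at_least_one([1 / N] * N)  # candidate minimum: uniform p_i = 1/N

# Sample random probability vectors on the simplex sum(p_i) = 1
# and check that none of them beats the uniform choice.
random.seed(0)
for _ in range(10_000):
    w = [random.random() for _ in range(N)]
    s = sum(w)
    assert p_at_least_one([x / s for x in w]) >= best
print("uniform p_i = 1/N is never beaten; P_OS =", best)
```

This is only a numeric spot check, of course; the Lagrange-multiplier argument is what actually proves optimality.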


Let's calculate the probability of getting no success (NS), using independence:

$$P_{NS} = \prod_{i=1}^N (1-p_i).$$

Then, the probability of having at least one success (OS) is:

$$P_{OS} = 1 - P_{NS} = 1 - \prod_{i=1}^N (1-p_i).$$

You want to minimize $P_{OS}$ w.r.t. $\{p_i\}$ subject to $\sum_{i=1}^N p_i = 1$.

Notice that this problem is equivalent to solving the following:

Maximize $$1 - P_{OS} = 1 - \left(1 - \prod_{i=1}^N (1-p_i)\right) = \prod_{i=1}^N (1-p_i)$$ w.r.t. $\{p_i\}$ subject to $\sum_{i=1}^N p_i = 1$.

Moreover, if we apply a monotone increasing function such as $\log$ to the objective $1-P_{OS}$, the maximizer does not change, so the problem is still the same:

Maximize $$\log\left({1 - P_{OS}}\right) = \log{\displaystyle \prod_{i=1}^N (1-p_i)}$$ w.r.t. $\{p_i\}$ subject to $\sum_{i=1}^N p_i = 1$.

Now, notice that:

$$\log{\displaystyle \prod_{i=1}^N (1-p_i)} = \sum_{i=1}^N \log{(1-p_i)}.$$

The problem now is:

Maximize $$\log\left({1 - P_{OS}}\right) = \sum_{i=1}^N \log{(1-p_i)}$$ w.r.t. $\{p_i\}$ subject to $\sum_{i=1}^N p_i = 1$.

One way to solve this problem is to use Lagrange multipliers. Another route is to "embed" the constraint $\sum_{i=1}^N p_i = 1$ into the objective function by substituting $p_N = 1 - \sum_{i=1}^{N-1} p_i$:

$$\sum_{i=1}^N \log{(1-p_i)} = \sum_{i=1}^{N-1} \log{(1-p_i)} + \log\left(1-\left(1 - \sum_{i=1}^{N-1}p_i\right)\right) = \\ = \sum_{i=1}^{N-1} \log{(1-p_i)} + \log\left(\sum_{i=1}^{N-1}p_i\right).$$
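To double-check the substitution (a quick numeric sketch with an arbitrary point on the simplex), both sides of the identity agree:

```python
import math
import random

# Check that sum_i log(1 - p_i) equals the version with the constraint
# p_N = 1 - sum_{i<N} p_i substituted in, for a random simplex point.
random.seed(1)
N = 4
w = [random.random() for _ in range(N)]
s = sum(w)
p = [x / s for x in w]  # a random point with sum(p) = 1

lhs = sum(math.log(1 - pi) for pi in p)
head = p[:-1]
rhs = sum(math.log(1 - pi) for pi in head) + math.log(sum(head))
print("difference:", abs(lhs - rhs))
```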

The derivative of this function with respect to $p_k$, $1 \leq k \leq N-1$ is:

$$-\frac{1}{1-p_k} + \frac{1}{\sum_{i=1}^{N-1}p_i}.$$

Setting this equal to $0$, we get:

$$\sum_{i=1}^{N-1}p_i = 1- p_k ~\forall\, 1 \leq k \leq N-1 \Rightarrow \\ 1 - p_{N} = 1- p_k ~\forall\, 1 \leq k \leq N-1 \Rightarrow \\ p_{N} = p_k ~\forall\, 1 \leq k \leq N-1.$$

Combined with the constraint $\sum_{i=1}^N p_i = 1$, the only way to satisfy the previous equalities is $p_k = p_N = \frac{1}{N}$ for all $k$. Plugging this back in gives the minimum probability $P_{OS} = 1 - \left(1 - \frac{1}{N}\right)^N$, which tends to $1 - e^{-1}$ as $N \to \infty$.