Determining expected value with Linearity of Expectation


A store sells $n$ flavors of chips. When you place an order, the store sends you one bag of chips, chosen uniformly at random from the $n$ flavors, independently of previous orders. Tiffany would like to try all flavors, so she repeatedly places orders (one bag per order) until she has received at least one bag of each flavor. Define the random variable $X$ to be the total number of orders that Tiffany places. Determine the expected value $\mathbb{E}[X]$.

  • Hint: Use Linearity of Expectation. If Tiffany has received exactly $i$ different flavors of chips, how many orders does she expect to place until she receives a new flavor?

Could someone walk me through the logic for approaching this problem? I'm not even sure where to start. This is for a math course. Thanks in advance.


There are 3 best solutions below


Let the random variable $X_i$ be defined as the number of orders Tiffany has to place, when having seen exactly $i$ distinct flavors so far, to get a bag of a new flavor; and let $$ X \stackrel{\rm def}{=} X_0+\dots+X_{n-1}. $$ The problem asks us to compute $\mathbb{E}X$; by linearity of expectation, this is equal to $\sum_{i=0}^{n-1}\mathbb{E}X_i$.

Now, each $X_i$ follows a geometric distribution with parameter $p_i$, where $p_i=1-\frac{i}{n}$. Indeed, having seen $i$ flavors so far, there are exactly $n-i$ unseen ones, and thus a probability $\frac{n-i}{n}$ that an order will hit one of those (as the choice is made uniformly at random, independently of previous orders). Therefore, $$ \mathbb{E}X_i = \frac{1}{p_i} = \frac{n}{n-i} = \frac{1}{1-\frac{i}{n}} $$ and $$ \mathbb{E}X = \sum_{i=0}^{n-1} \frac{1}{p_i} = \sum_{i=0}^{n-1} \frac{n}{n-i} = n\sum_{k=1}^{n}\frac{1}{k} = nH_n, $$ where $H_n$ is the $n$-th harmonic number.
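As a sanity check on this formula, here is a short Monte Carlo sketch in Python (the function names are my own, not from the question): it compares the exact sum $\sum_{i=0}^{n-1}\frac{n}{n-i}$ against a simulated average.

```python
import random
from fractions import Fraction

def coupon_collector_trial(n, rng):
    """Simulate orders until all n flavors have been received at least once."""
    seen = set()
    orders = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))  # one order: a uniformly random flavor
        orders += 1
    return orders

def expected_orders(n):
    """Exact expectation: sum of n/(n-i) for i = 0..n-1, i.e. n * H_n."""
    return sum(Fraction(n, n - i) for i in range(n))

rng = random.Random(0)
n = 10
exact = expected_orders(n)  # n * H_n = 7381/252 for n = 10
avg = sum(coupon_collector_trial(n, rng) for _ in range(20000)) / 20000
```

With 20,000 trials the simulated average lands close to the exact value $\approx 29.29$ for $n=10$.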


For fixed $i<n$, let $A_k$ for $k=1,2,3,\dots$ denote the event that the first time she gets a new flavour is on the $k^{th}$ purchase. Let $X$ denote the random variable representing the number of bags she must buy until she receives a new flavour. Then

$P(A_k) = (\frac{i}{n})^{k-1}(1-\frac{i}{n})$.

So $$E(X) = E(\sum_{k=1}^{\infty} X \cdot 1_{A_k}) = \sum_{k=1}^{\infty}E(X \cdot 1_{A_k}) = \sum_{k=1}^{\infty} k \cdot P(A_k) =\sum_{k=1}^{\infty} k \cdot (\frac{i}{n})^{k-1}(1-\frac{i}{n})$$

where I have used linearity of expectation in the second equality, and where $1_A$ denotes the indicator of any event $A$. It remains to evaluate this standard sum, which gives $E(X) = \frac{1}{1-\frac{i}{n}} = \frac{n}{n-i}$.


Let $X_i$ be the random variable counting the number of orders until a flavour different from the $i-1$ flavours already gotten is received. These $X_i$ are geometric random variables (number of independent trials until success) with expectations $1/p_i$, where $p_i = \frac{n-i+1}{n}$ is the probability of receiving a new flavour. The number of orders needed to obtain all the flavours is $X_1+\cdots+X_n$, so linearity of expectation implies that the expected number of orders is $E(X_1)+\cdots+E(X_n)$. For instance, if $n=10$, then $E(X_1)=1$, $E(X_2)=10/9$, $E(X_3)=10/8$, and so on.
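The $n=10$ example can be tabulated exactly in a few lines of Python (a sketch; the variable names are mine):

```python
from fractions import Fraction

n = 10
# E(X_i) = n / (n - i + 1): after i - 1 flavours, n - i + 1 remain unseen.
terms = [Fraction(n, n - i + 1) for i in range(1, n + 1)]
total = sum(terms)  # exact expected number of orders

# terms begins 1, 10/9, 10/8, ... and total = 7381/252 ~ 29.29
```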