Determining mean and variance for number of independent trials


[The question is given as an image: problem 14, asking for the mean and variance of $N_k$, the number of independent trials needed to obtain $k$ successes, each trial succeeding with probability $p$.]

This is the question. I tried writing the mean of $N_k$ as $\mathbb E(N_k) = \mathbb E(k) + \mathbb E(N_k - k)$, since $N_k = k + (N_k - k)$. Then $\mathbb E(k) = kp$ and $\mathbb E(N_k - k) = (1-p)(N_k - k)$, so substituting into the equation gives $\mathbb E(N_k) = kp + (1-p)(N_k - k)$. For the variance, the formula is $\mathrm{Var}(N_k) = \mathbb E[N_k^2] - (\mathbb E[N_k])^2$, but I don't know whether I should compute it at all, since I thought the variance should be $0$ because the trials are independent.

My prof told me it's incorrect. He said: "In 14 none of the answers is correct, sorry. Even more, the answer that you obtained for the variance is a random variable, which is wrong: it should be a deterministic number depending on $k$ and $p$." I can see that the mean and variance will be in terms of $k$ and $p$, but I don't know how I am supposed to express $N_k$ in terms of $k$ and $p$. Need some help here, guys. Any help will be greatly appreciated.

1 Answer

$N_k$ is a discrete random variable, hence we need to find probability of events such as $\{N_k = m\}$. Take any $m \ge k$, $m \in \mathbb N$.

We have $\mathbb P(N_k=m) = { m-1 \choose k-1} p^{k}(1-p)^{m-k}$: the last trial must be a success (probability $p$), and among the first $m-1$ trials we choose the ${m-1 \choose k-1}$ positions of the remaining $k-1$ successes (probability $p^{k-1}$); the remaining $m-1 - (k-1) = m-k$ places must be failures (probability $(1-p)^{m-k}$). Multiplying these gives the formula.
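As a sanity check, here is a small simulation sketch (the choices of $k$, $p$, the seed, and the number of runs are arbitrary) comparing the empirical distribution of $N_k$ against this formula:

```python
import random
from math import comb

# Simulate independent trials with success probability p until the
# k-th success, and return the total number of trials used.
def trials_until_kth_success(k, p, rng):
    successes, m = 0, 0
    while successes < k:
        m += 1
        if rng.random() < p:
            successes += 1
    return m

rng = random.Random(0)
k, p, n_sims = 3, 0.4, 200_000

# Empirical frequency of each value of N_k.
counts = {}
for _ in range(n_sims):
    m = trials_until_kth_success(k, p, rng)
    counts[m] = counts.get(m, 0) + 1

# Compare against P(N_k = m) = C(m-1, k-1) p^k (1-p)^(m-k).
for m in range(k, k + 5):
    exact = comb(m - 1, k - 1) * p**k * (1 - p) ** (m - k)
    print(m, counts.get(m, 0) / n_sims, round(exact, 4))
```

The empirical frequencies should agree with the closed-form probabilities up to Monte Carlo noise.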

We have $\mathbb E[N_k] = \sum_{m=k}^\infty m {m-1 \choose k-1} p^k (1-p)^{m-k} = \frac{kp^k}{(1-p)^k}\sum_{m=k}^\infty{m \choose k} (1-p)^{m}$, using the identity $m{m-1 \choose k-1} = k{m \choose k}$.

Evaluating this sum directly seems challenging, so we will compute the mean in a different manner.

Note that $N_k = X_1 + X_2 + \dots + X_k$, where $X_j$ is the number of trials between the $(j-1)$-th and the $j$-th success, $j \in \{1,\dots,k\}$.

Every $X_j$ follows a geometric distribution $Geo(p)$, that is, $\mathbb P(X_j = m) = p(1-p)^{m-1}$ for $m \in \mathbb N_+$.

Note that we have $\mathbb E[N_k] = \sum_{j=1}^k \mathbb E[X_j] = k\mathbb E[X_1] = \frac{k}{p}$

So that $\frac{k}{p} =\frac{kp^k}{(1-p)^k}\sum_{m=k}^\infty{m \choose k} (1-p)^{m}$

And as a by-product we have calculated the sum: $\sum_{m=k}^\infty{m \choose k}x^{m} = \frac{x^k}{(1-x)^{k+1}}$, for $x \in [0,1)$ at least.
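This identity can be checked numerically by truncating the series; in the sketch below the sampled values of $k$ and $x$ and the truncation cutoff are arbitrary choices.

```python
import math

# Truncated version of sum_{m=k}^inf C(m, k) x^m; for x in [0, 1)
# the tail decays geometrically, so a few thousand terms suffice.
def truncated_sum(k, x, terms=2000):
    return sum(math.comb(m, k) * x**m for m in range(k, k + terms))

# Compare against the closed form x^k / (1-x)^(k+1).
for k in (1, 3, 7):
    for x in (0.2, 0.5, 0.8):
        closed = x**k / (1 - x) ** (k + 1)
        assert math.isclose(truncated_sum(k, x), closed, rel_tol=1e-9)
print("identity holds for the sampled (k, x) pairs")
```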

To calculate the variance, note that $Var(N_k) = \sum_{j=1}^k Var(X_j) = k\, Var(X_1) = \frac{k(1-p)}{p^2}$, since the variables $X_j$ are independent and follow the same distribution.

I assumed you know that a geometric random variable $Geo(p)$ has mean $\frac{1}{p}$ and variance $\frac{1-p}{p^2}$.
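To double-check both formulas, here is a quick simulation sketch (the values of $k$, $p$, the seed, and the number of runs are arbitrary choices) that draws $N_k$ as a sum of $k$ independent $Geo(p)$ variables:

```python
import random

# One draw from Geo(p): the number of trials up to and including
# the first success.
def geometric(p, rng):
    m = 1
    while rng.random() >= p:
        m += 1
    return m

rng = random.Random(1)
k, p, n_sims = 5, 0.3, 100_000

# N_k = X_1 + ... + X_k with X_j ~ Geo(p) independent.
samples = [sum(geometric(p, rng) for _ in range(k)) for _ in range(n_sims)]

mean = sum(samples) / n_sims
var = sum((s - mean) ** 2 for s in samples) / n_sims
print(mean, k / p)               # empirical mean vs k/p
print(var, k * (1 - p) / p**2)   # empirical variance vs k(1-p)/p^2
```

Both empirical values should land close to the theoretical ones, up to Monte Carlo noise.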