Dependence of 'total number of heads' and 'total number of tails' in $n$ coin tosses


Suppose we have an unbiased coin, and in each run of the experiment we toss it $n$ times. The outcome of each toss is independent of the other tosses, and the probabilities of heads and tails are the same on every toss. Hence the probability of getting $k$ heads (or tails) in total is given by the binomial probability mass function, $b(n,k,p)$. Now suppose we define two random variables as follows.
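As a quick sanity check, the binomial pmf $b(n,k,p)$ mentioned above can be evaluated directly. A minimal Python sketch (the function name is my own):

```python
from math import comb

def binom_pmf(n, k, p):
    """Probability of exactly k heads in n independent tosses,
    each with head probability p: the binomial pmf b(n, k, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Unbiased coin, n = 3 tosses: P(exactly 2 heads) = 3/8
print(binom_pmf(3, 2, 0.5))  # 0.375
```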

$S_h=\text{total number of heads in the } n \text{ coin toss experiment}$

$S_t=\text{total number of tails in the } n \text{ coin toss experiment}$

Now our questions are,

(i) Are the two random variables $S_h$ and $S_t$ dependent?

(ii) If yes, how can this be proved rigorously?

(iii) What will be the covariance between the two RVs?

Thanks in advance!

Reply: @anomaly

I think these will be dependent random variables. Take $n=3$. Then,

$\Omega=\{HHH,HHT,HTH,HTT,TTH,TTT,THT,THH\}$.

Now if events $(S_h=2)$ and $(S_t=1)$ are independent of each other then,

$P(S_h=2\ \text{and}\ S_t=1)=P(S_h=2)\cdot P(S_t=1)$

But from the sample space we get,

$P(S_h=2)=\frac{3}{8}$

$P(S_t=1)=\frac{3}{8}$

$P(S_h=2 \ and \ S_t=1)=\frac{3}{8}$

Hence those two events cannot be independent, since $\frac{3}{8}\neq\frac{3}{8}\cdot\frac{3}{8}$. Because independence of random variables requires independence of all pairs of events $(S_h=i)$ and $(S_t=j)$, the RVs $S_h$ and $S_t$ are dependent.
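The enumeration argument above can be checked mechanically. A small Python sketch over the $n=3$ sample space:

```python
from itertools import product

# Enumerate the n = 3 sample space and compare the joint probability
# with the product of the marginals, as in the argument above.
n = 3
omega = list(product("HT", repeat=n))  # 8 equally likely outcomes

p_h2 = sum(1 for w in omega if w.count("H") == 2) / len(omega)
p_t1 = sum(1 for w in omega if w.count("T") == 1) / len(omega)
p_joint = sum(1 for w in omega
              if w.count("H") == 2 and w.count("T") == 1) / len(omega)

print(p_h2, p_t1, p_joint)     # 0.375 0.375 0.375
print(p_joint == p_h2 * p_t1)  # False -> the events are not independent
```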



BEST ANSWER

Hint: $S_h = n - S_t$. Covariance is linear in each variable.
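Following the hint, $\operatorname{Cov}(S_h,S_t)=\operatorname{Cov}(n-S_t,S_t)=-\operatorname{Var}(S_t)=-np(1-p)$. An exact numeric check (a sketch; the parameters $n=10$, $p=0.5$ are arbitrary):

```python
from math import comb

# Exact check: since S_h = n - S_t,
# Cov(S_h, S_t) = -Var(S_t) = -n*p*(1-p).
# Computed here by summing over the binomial distribution of S_h.
n, p = 10, 0.5
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

E_h  = sum(k * pmf[k] for k in range(n + 1))            # E[S_h]
E_t  = sum((n - k) * pmf[k] for k in range(n + 1))      # E[S_t]
E_ht = sum(k * (n - k) * pmf[k] for k in range(n + 1))  # E[S_h S_t]

cov = E_ht - E_h * E_t
print(cov, -n * p * (1 - p))  # -2.5 -2.5
```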

ANSWER

The expectation of $S_hS_t$ is

$E(S_hS_t)=\sum_{k=0}^{n} (n-k)\,k\,\binom{n}{k}\,p^{k}(1-p)^{n-k}$

ANSWER

$$E(S_hS_t)=\sum_{k=0}^{n} (n-k)\,k\,\frac{n!}{k!\,(n-k)!} \, p^k (1-p)^{n-k}\\ =\sum_{k=1}^{n-1} \frac{n!}{(k-1)!\,(n-k-1)!} \, p^k (1-p)^{n-k}\\ =n(n-1)p(1-p)\sum_{k=1}^{n-1} \frac{(n-2)!}{(k-1)!\,(n-k-1)!} \, p^{k-1} (1-p)^{n-1-k}\\=n(n-1)p(1-p),$$

since the last sum is $\sum_{j=0}^{n-2}\binom{n-2}{j}p^{j}(1-p)^{n-2-j}=1$. Hence $\operatorname{Cov}(S_h,S_t)=E(S_hS_t)-E(S_h)E(S_t)=n(n-1)p(1-p)-n^2p(1-p)=-np(1-p)$.
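Note that the value of this sum can be derived independently: $E(S_hS_t)=E\big(S_h(n-S_h)\big)=nE(S_h)-E(S_h^2)=n^2p-\big(np(1-p)+n^2p^2\big)=n(n-1)p(1-p)$. A Monte Carlo sketch of that value (the parameters $n=20$, $p=0.3$ are arbitrary):

```python
import random

# Monte Carlo sanity check of E[S_h * S_t] = n*(n-1)*p*(1-p),
# using S_t = n - S_h.
random.seed(0)
n, p, trials = 20, 0.3, 200_000

acc = 0.0
for _ in range(trials):
    s_h = sum(random.random() < p for _ in range(n))  # one Bin(n, p) draw
    acc += s_h * (n - s_h)

est = acc / trials
print(est)                        # should be close to 79.8
print(n * (n - 1) * p * (1 - p))  # 79.8
```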