Suppose we have an unbiased coin, and in each phase of the experiment we toss it $n$ times. The outcome of each toss is independent of the others, and the probabilities of heads and tails are the same for every toss. Hence the probability of getting exactly $k$ heads (or tails) in total is given by the binomial probability mass function $b(k;n,p)=\binom{n}{k}p^k(1-p)^{n-k}$. Now suppose we define two random variables as follows:
$S_h=\text{total number of heads in the } n\text{-toss experiment}$
$S_t=\text{total number of tails in the } n\text{-toss experiment}$
Now our questions are,
(i) Are the two random variables $S_h$ and $S_t$ dependent?
(ii) If yes, how can this be proved rigorously?
(iii) What is the covariance between the two RVs?
Thanks in advance!
Reply: @anomaly
I think they are dependent random variables. Take $n=3$; then
$\Omega=\{HHH,HHT,HTH,HTT,TTH,TTT,THT,THH\}$.
Now if the events $(S_h=2)$ and $(S_t=1)$ were independent of each other, we would have
$P(S_h=2 \text{ and } S_t=1)=P(S_h=2)\cdot P(S_t=1)$
But from sample space we get,
$P(S_h=2)=\frac{3}{8}$
$P(S_t=1)=\frac{3}{8}$
$P(S_h=2 \ and \ S_t=1)=\frac{3}{8}$
(The joint probability equals $\frac{3}{8}$ because $S_h=2$ forces $S_t=1$ when $n=3$.) Since $\frac{3}{8}\neq\frac{3}{8}\cdot\frac{3}{8}=\frac{9}{64}$, these two events are not independent. Because independence of random variables requires independence of *all* pairs of events of the form $(S_h\in A)$ and $(S_t\in B)$, a single counterexample suffices: the RVs are dependent.
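The $n=3$ counterexample can be checked by brute-force enumeration of the sample space. A minimal sketch (the variable names `p_h2`, `p_t1`, `p_both` are my own, not from the post):

```python
from itertools import product

# Enumerate all 2^3 equally likely outcomes of 3 fair-coin tosses
outcomes = list(product("HT", repeat=3))
N = len(outcomes)  # 8

p_h2 = sum(o.count("H") == 2 for o in outcomes) / N                    # P(S_h = 2)
p_t1 = sum(o.count("T") == 1 for o in outcomes) / N                    # P(S_t = 1)
p_both = sum(o.count("H") == 2 and o.count("T") == 1 for o in outcomes) / N

print(p_h2, p_t1, p_both)  # 0.375 0.375 0.375
print(p_h2 * p_t1)         # 0.140625 -- the product 9/64, not 3/8
```

The joint probability $3/8$ differs from the product $9/64$, confirming the dependence argument above.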
Hint: $S_h = n - S_t$, and covariance is linear in each argument, so $\operatorname{Cov}(S_h,S_t)=\operatorname{Cov}(n-S_t,S_t)=-\operatorname{Var}(S_t)=-np(1-p)$, which is $-n/4$ for a fair coin.
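The hint can be verified numerically by computing the exact covariance over the uniform sample space for $n=3$. A sketch, assuming a fair coin ($p=1/2$); the names `e_h`, `e_t`, `e_ht` are my own:

```python
from itertools import product

n, p = 3, 0.5
outcomes = list(product("HT", repeat=n))
N = len(outcomes)

# Exact expectations by enumeration over the 2^n equally likely outcomes
e_h = sum(o.count("H") for o in outcomes) / N                  # E[S_h]
e_t = sum(o.count("T") for o in outcomes) / N                  # E[S_t]
e_ht = sum(o.count("H") * o.count("T") for o in outcomes) / N  # E[S_h * S_t]

cov = e_ht - e_h * e_t
print(cov)               # -0.75
print(-n * p * (1 - p))  # -0.75, i.e. -Var(S_t), matching the hint
```

A strictly negative covariance is what intuition suggests: every extra head is one fewer tail.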