Conditional probability and entropy


There is a cafe that offers only coffee (C) and tea (T). Cups come in three sizes: large (L), medium (M), and small (S). Let $X$ be the random variable for the type of drink $\{C, T\}$, and $Y$ the random variable for the cup size $\{L, M, S\}$.

We know the following.

Joint distribution: $\Pr(X = C, Y = L) = 0.3$

That is, the probability that a visitor orders a large cup of coffee is 0.3, which is higher than that of any other combination of drink and cup size.

$$\Pr(X = T, Y = L) = 0.1, \qquad \Pr(X = C, Y = M) = \Pr(X = T, Y = S).$$

Entropy: $H(X) = 1.0$ bit, $H(X \mid Y = S) = 0.469$ bit.

Find $H(Y)$, $H(X \mid Y)$, $H(Y \mid X)$, $H(X, Y)$, and $I(X; Y)$.
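For reference, the standard identities relating these quantities (all logs base 2):

$$H(X \mid Y) = \sum_{y} \Pr(Y = y)\, H(X \mid Y = y) \qquad \text{(conditional entropy)}$$

$$H(X, Y) = H(Y) + H(X \mid Y) = H(X) + H(Y \mid X) \qquad \text{(chain rule)}$$

$$I(X; Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y) \qquad \text{(mutual information)}$$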

I tried to build the joint distribution table:

$$\begin{array}{c|ccc|c}
P(X, Y) & Y = \text{L} & Y = \text{M} & Y = \text{S} & \\ \hline
X = \text{C} & 0.3 & x & ? & 0.5 \\
X = \text{T} & 0.1 & ? & x & 0.5
\end{array}$$

(The two cells marked $x$ are equal by the given constraint, the $?$ cells are unknown, and the last column holds the row sums $P(X = \text{C}) = P(X = \text{T}) = 0.5$.)

I see from $H(X) = 1.0$ bit that $p = 0.5$ and $1 - p = 0.5$, because $H(X) = -p \log_2 p - (1 - p) \log_2(1 - p) = 1$ only when $p = 0.5$; this gives the row sums in the table.

But I'm stuck at $H(X \mid Y = S) = 0.469$ bit. Solving $-p \log_2 p - (1 - p) \log_2(1 - p) = 0.469$ gives $p = 0.1$ or $p = 0.9$, but if I put either value into the table as $P(X, Y = S)$, it won't match. How can I find the probability $P(X = C, Y = S)$? I hope someone can enlighten me, thanks. (All logs are base 2.)
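As a quick numerical sanity check on the entropy arithmetic above (a minimal sketch in Python; `h2` is just my name for the binary entropy function, with all logs base 2):

```python
import math

def h2(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h2(0.5))  # 1.0 bit, matching H(X)
print(h2(0.1))  # ~0.469 bits, matching H(X | Y = S)
print(h2(0.9))  # same value: h2 is symmetric, so p = 0.1 and p = 0.9 both fit
```

This confirms that $H(X) = 1$ forces $p = 0.5$, and that both $p = 0.1$ and $p = 0.9$ are consistent with $H(X \mid Y = S) = 0.469$.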