I don't understand what $H(X)$ denotes here.
This is the entropy of $X$; it suffices to apply the definition. If $X$ takes finitely many values with probabilities $p_1, \ldots, p_k$, then
$$H(X)=\sum_{i=1}^k -p_i \log(p_i)$$
where usually the base-$2$ logarithm is used.
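The definition translates directly into code. A minimal sketch (the function name `entropy` is my own choice, not from the question):

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a finite distribution.

    Terms with p_i = 0 contribute nothing, by the usual
    convention that 0 * log(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# Sanity check: a fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```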
The first $X$ is defined via $X_1$ and $X_2$.
So what is $P(X=1)$? There are two cases, so we apply the law of total probability.
$$P(X=1)=P(X_1=1)\frac{8}{10} + P(X_2=1)\frac{2}{10} = \frac{2}{10}\frac{8}{10} + \frac{7}{10}\frac{2}{10} = \frac{3}{10}$$
The value $2$ can only be achieved via the first case of the definition of $X$, so $$P(X=2) = \frac{8}{10}P(X_1=2)= \frac{8}{10}\frac{4}{10}=\frac{32}{100}$$
Likewise for value $3$:
$$P(X=3) = \frac{8}{10}P(X_1=3)=\frac{8}{10} \frac{4}{10}=\frac{32}{100}$$
Value $4$ is also possible, only via $X_2$:
$$P(X=4) = \frac{2}{10}P(X_2=4)=\frac{2}{10}\frac{3}{10}=\frac{6}{100}$$
As $30+32+32+6=100$, the probabilities sum to $1$, as they should,
so the entropy is
$$H(X)=-\left( 0.3\log(0.3) + 0.32\log(0.32) + 0.32\log(0.32) + 0.06\log(0.06) \right)$$
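The whole computation can be checked numerically. The sketch below rebuilds the mixture from the quantities used above: $X$ takes the value of $X_1$ with weight $\frac{8}{10}$ and of $X_2$ with weight $\frac{2}{10}$, with the component probabilities read off from the calculations for $P(X=1),\ldots,P(X=4)$. Using `Fraction` keeps the probabilities exact:

```python
from fractions import Fraction as F
from math import log2

# Mixture weights and component distributions, as in the computation above.
w1, w2 = F(8, 10), F(2, 10)
p_x1 = {1: F(2, 10), 2: F(4, 10), 3: F(4, 10)}
p_x2 = {1: F(7, 10), 4: F(3, 10)}

# Law of total probability: P(X=v) = (8/10) P(X1=v) + (2/10) P(X2=v)
p_x = {v: w1 * p_x1.get(v, 0) + w2 * p_x2.get(v, 0)
       for v in sorted(set(p_x1) | set(p_x2))}

assert sum(p_x.values()) == 1  # probabilities sum to 1, as they ought to

# Entropy with the base-2 logarithm.
h = -sum(p * log2(p) for p in p_x.values())
print(p_x)  # 3/10, 8/25 (= 32/100), 8/25, 3/50 (= 6/100)
print(h)    # about 1.817 bits
```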
Compute the second entropy the same way: first find the probability of each possible value, then apply the formula.