Learning about Entropy, notation question: is $P(X)$ a random variable?

Background

I'm taking a first year probability course and I've gotten reasonably comfortable calculating $E[g(X)]$ using LOTUS. Now I'm learning about Entropy and for the first time I'm encountering a situation where $P(X)$ is in $g(X)$.

Entropy

From Wikipedia:

$$H(X) = E[I(X)] = E[-\log(P(X))]$$
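To make the formula concrete for myself, here's a small numeric sketch (the pmf is just an assumed toy example, not anything from the course):

```python
import math

# An assumed toy pmf for X: P(X = k) = pmf[k]
pmf = {1: 0.5, 2: 0.25, 3: 0.25}

# Entropy straight from the definition: H(X) = -sum_x p(x) log2 p(x)
H = -sum(p * math.log2(p) for p in pmf.values())
print(H)  # 1.5 bits
```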

Question 1: Is it correct to think of $P(X)$ as a function of $X$, and therefore a RV itself? I'm wondering how to formally define $P(X)$; here is my attempt:

$$ P(X) = g(X) = \left\{ \begin{array}{ll} p_1 & \text{with } \Pr(P(X) = p_1) = p_1\\ p_2 & \text{with } \Pr(P(X) = p_2) = p_2\\ \vdots \\ p_n & \text{with } \Pr(P(X) = p_n) = p_n\\ \end{array} \right. $$

Like I said, I'm used to evaluating $E[g(X)]$; it's just weirding me out that here $g$ is the pmf of $X$ itself.
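To convince myself that $P(X)$ really does behave like an ordinary $g(X)$, I tried a quick simulation sketch (same assumed toy pmf as above): sample $X$, evaluate $g(X) = P(X)$ by looking the sampled value up in its own pmf, and average $-\log_2 P(X)$.

```python
import math
import random

random.seed(0)

pmf = {1: 0.5, 2: 0.25, 3: 0.25}  # assumed toy pmf
values, probs = zip(*pmf.items())

# Draw samples of X, then apply g(x) = P(X = x): look up the pmf at each sample.
samples = random.choices(values, weights=probs, k=100_000)
surprisals = [-math.log2(pmf[x]) for x in samples]

# Monte Carlo estimate of E[-log2 P(X)] vs. the exact entropy
estimate = sum(surprisals) / len(surprisals)
exact = -sum(p * math.log2(p) for p in pmf.values())
print(estimate, exact)
```

The sample average of the surprisals lands near the exact entropy, which is just LOTUS at work with $g$ being the pmf.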

Question 2: If $P(X)$ is a RV, does the quantity $E[P(X)] = \sum_x p(x)\,p(x)$ ever come up or get studied? I did some searching, and outside of homework exercises it doesn't seem that interesting.
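For concreteness, here's that sum evaluated by LOTUS for the same assumed toy pmf. (As far as I can tell, $\sum_x p(x)^2$ is exactly the probability that two independent copies of $X$ take the same value.)

```python
pmf = {1: 0.5, 2: 0.25, 3: 0.25}  # assumed toy pmf

# E[P(X)] = sum_x p(x) * p(x), i.e. LOTUS with g equal to the pmf itself
expected_p = sum(p * p for p in pmf.values())
print(expected_p)  # 0.25 + 0.0625 + 0.0625 = 0.375
```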

Thanks for your patience and help.