Expectation, conditional expectation and maximum entropy distribution.


Suppose $X\sim f(\cdot;q)$, where $f(\cdot;q)$ is a probability density function belonging to some parametric family, and let $\Omega$ denote its support. Restricting attention to that parametric family, the maximum entropy density for $X$ over the support $\Omega$ is $f(\cdot;q)$ itself. Mathematically,

$$\max_{p} \sum_{x \in \Omega} f(x;q) \log f(x;p) = \sum_{x \in \Omega} f(x;q) \log f(x;q).$$
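This identity (the cross-entropy $\sum_x f(x;q)\log f(x;p)$ is maximized at $p=q$, by Gibbs' inequality) can be checked numerically. A minimal sketch, assuming a hypothetical Binomial$(n=10,p)$ family with $q=0.3$ (these concrete values are my choice, not from the question):

```python
import math

def binom_pmf(x, n, p):
    # Binomial(n, p) probability mass at x
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def cross_entropy(q, p, n=10):
    # sum over the full support {0, ..., n} of f(x;q) log f(x;p)
    return sum(binom_pmf(x, n, q) * math.log(binom_pmf(x, n, p))
               for x in range(n + 1))

q = 0.3
grid = [i / 1000 for i in range(1, 1000)]          # p in (0, 1)
p_star = max(grid, key=lambda p: cross_entropy(q, p))
print(p_star)  # the grid maximizer coincides with q = 0.3
```

The maximizer lands exactly on $p=q$ because the cross-entropy is strictly concave in $p$ for this family and $q$ lies on the grid.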

Suppose we truncate the support to some non-empty set $\Xi := \{x \in \Omega : x > k\}$. It has already been shown in this post that

$$p^* := \arg\max_{p} \sum_{x \in \Xi} f(x;q) \log f(x;p) \implies \sum_{x \in \Xi} f(x;q) \leq \sum_{x \in \Xi} f(x;p^*) \tag{1} \label{eq1}.$$

Now I am interested in situations where the inequality in \eqref{eq1} holds strictly, that is,

$$\sum_{x \in \Xi} f(x;q) < \sum_{x \in \Xi} f(x;p^*) \tag{2} \label{eq2}.$$

I suspect that $\mathbb{E}_{q}(X) \neq \mathbb{E}_{q}(X \mid X \in \Xi)$ always implies the strict inequality in \eqref{eq2}. Equivalently, whenever $\mathbb{E}_{q}(X) = \mathbb{E}_{q}(X \mid X \in \Xi)$, it follows that $\sum_{x \in \Xi} f(x;q) = \sum_{x \in \Xi} f(x;p^*)$. Is that right?
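The conjecture can at least be probed numerically. A minimal sketch, again assuming a hypothetical Binomial$(n=10,p)$ family with $q=0.3$ and truncation point $k=4$ (all concrete values are my choice for illustration): it finds the truncated-cross-entropy maximizer $p^*$ on a grid and compares the two masses and the two expectations.

```python
import math

def binom_pmf(x, n, p):
    # Binomial(n, p) probability mass at x
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

n, q, k = 10, 0.3, 4               # illustrative values, not from the question
Xi = range(k + 1, n + 1)           # truncated support {x : x > k}

def trunc_cross_entropy(p):
    # sum over Xi of f(x;q) log f(x;p), unnormalised, as in (1)
    return sum(binom_pmf(x, n, q) * math.log(binom_pmf(x, n, p)) for x in Xi)

grid = [i / 10000 for i in range(1, 10000)]
p_star = max(grid, key=trunc_cross_entropy)

mass_q = sum(binom_pmf(x, n, q) for x in Xi)       # sum_{Xi} f(x;q)
mass_p = sum(binom_pmf(x, n, p_star) for x in Xi)  # sum_{Xi} f(x;p*)
mean_q = sum(x * binom_pmf(x, n, q) for x in range(n + 1))     # E_q(X) = nq
cond_mean_q = sum(x * binom_pmf(x, n, q) for x in Xi) / mass_q  # E_q(X | Xi)

print(p_star, mass_q, mass_p)   # here mass_q < mass_p: the inequality is strict
print(mean_q, cond_mean_q)      # and indeed E_q(X) != E_q(X | X in Xi)
```

In this example $\mathbb{E}_q(X) = nq = 3$ while $\mathbb{E}_q(X \mid X \in \Xi) > k = 4$, and the mass inequality comes out strict, which is consistent with (but of course does not prove) the conjectured implication.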