How to evaluate the change of information of a random variable?


Given a random variable $X$ with finite alphabet $\mathcal{A}_X$ and a pmf $p_X(\cdot)$ of full support (there is no $x_0 \in \mathcal{A}_X$ such that $p_X(x_0) = 0$), I want to know its actual outcome. (Speaking about sport, $X$ could be the team that will buy a really famous player: the experts would give me $\mathcal{A}_X$ and $p_X(\cdot)$.)

I'm willing to pay an amount of money $V$ for this information, and I've found someone who can provide it. Before closing the deal, I learn that the outcome of $X$ will certainly not be some particular value, say $x_0$.

Since, prior to this discovery, the value of the information was $V$, what is the value $V^\star$ I should now pay for it?

How I would do it - I would evaluate $H(X)$ using the definition, then declare a new random variable $Y$ with $\mathcal{A}_Y = \mathcal{A}_X \setminus \{x_0\}$ and $p_Y(y) = \frac{p_X(y)}{1 - p_X(x_0)}$, and finally evaluate $H(Y)$. My guess is then $V^\star = \frac{H(Y)}{H(X)}V$.
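The procedure above can be sketched in a few lines of Python. The distribution and the price $V$ below are purely illustrative placeholders, not values from the question:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def remove_outcome(p, x0):
    """Renormalize the distribution after learning that x0 cannot occur."""
    z = 1.0 - p[x0]
    return {x: q / z for x, q in p.items() if x != x0}

# Hypothetical distribution over three candidate teams (names are made up).
p_X = {"team_a": 0.5, "team_b": 0.3, "team_c": 0.2}
V = 100.0  # price agreed before the discovery

# Learn that team_c is ruled out, renormalize, and apply the proposed proportion.
p_Y = remove_outcome(p_X, "team_c")
V_star = entropy(p_Y) / entropy(p_X) * V
```

This is just the proportional-pricing rule proposed in the question; whether that rule is justified is exactly what is being asked.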

I think this approach could be valid, but I am not sure, since it would feel more natural to me to evaluate something like $H(X \,|\, X \neq x_0)$. Also, I'm not sure that a simple proportion is the right way to find $V^\star$.

Best answer:

Let's investigate whether entropy is a reasonable measure for this purpose. Consider a random variable $X$ whose alphabet is $\mathcal{A}_X = \{ \alpha, \beta, \gamma \}$, with $p_X(\alpha) = 0.9$, $p_X(\beta) = 0.05$ and $p_X(\gamma) = 0.05$. The entropy of $X$, defined as $$H(X) = -\sum_{x \in \mathcal{A}_X} p_X(x) \log_2 p_X(x),$$ gives $H(X) \simeq 0.57$. First, suppose we learn that $\alpha$ won't be the outcome of $X$: we define $Y$ as stated in the question and evaluate its entropy, obtaining $H(Y) = 1$. In a second scenario, we instead learn that $\gamma$ won't be the outcome of $X$: defining $Z$ in the same way, we get $H(Z) \simeq 0.3$.
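These three values can be checked numerically; a minimal sketch using the definition above:

```python
import math

def entropy(p):
    # Shannon entropy in bits; zero-probability terms contribute nothing.
    return -sum(q * math.log2(q) for q in p if q > 0)

p_X = [0.9, 0.05, 0.05]            # alpha, beta, gamma
H_X = entropy(p_X)                 # ~0.57 bits

# Y: alpha ruled out -> beta and gamma renormalized to 0.5 each
p_Y = [0.05 / 0.1, 0.05 / 0.1]
H_Y = entropy(p_Y)                 # exactly 1 bit: a fair coin toss

# Z: gamma ruled out -> alpha and beta renormalized by 1 - 0.05
p_Z = [0.9 / 0.95, 0.05 / 0.95]
H_Z = entropy(p_Z)                 # ~0.30 bits
```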

In the initial scenario $\alpha$ is the most likely outcome, almost a certainty. If we remove that outcome, we are left with nothing better than a coin toss, the worst-case scenario: our knowledge about $X$ is actually lower than before (we thought we were almost certain about the outcome) and now we face a 50-50 decision. Knowing the exact outcome is now more valuable than it was before - this is reflected by the fact that $H(Y) > H(X)$.

On the other hand, removing $\gamma$ from the possible outcomes increases the probability that $\alpha$ will be the real outcome, lowering the uncertainty of the scenario and the value of the information I wanted to buy - confirmed by $H(Z) < H(X)$.

This example suggests that entropy is a reasonable measure for this purpose.