Let $P$ be a set of probabilities such that $\sum_{p_i \in P} p_i = 1$.
Moreover, let $H(P)$ be the Shannon entropy of the set of probabilities $P$: $$ H(P) = -\sum_{p_i \in P} p_i \log_2 p_i $$
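For concreteness, this definition can be computed directly; here is a minimal Python sketch (the function name is my own choice, and terms with $p_i = 0$ are conventionally taken to contribute $0$):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(P) in bits; entries with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes has entropy log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A deterministic distribution has entropy 0.
print(shannon_entropy([1.0]))  # → 0.0
```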
I define a set of actions $a_1, \ldots, a_N$ which modify the set of probabilities $P$.
Performing an action. When $a_i$ is performed:
- Some of the probabilities are removed from $P$
- $P$ is normalized so that its summation is still 1
In this context, I define $H_{i}(P)$ as the Shannon entropy of the resulting set $P$.
Performing two actions. When $a_j$ is performed after $a_i$:
- Some of the probabilities are removed from $P$ as a consequence of $a_i$
- $P$ is normalized so that its summation is still 1
- Some of the probabilities are removed from the new $P$ as a consequence of $a_j$
- $P$ is normalized again
The final Shannon entropy is denoted $H_{j}(P \mid a_i)$.
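The remove-then-renormalize mechanism above can be sketched in a few lines of Python; `perform_action` and the convention of identifying an action with the set of indices it removes are my own illustrative assumptions:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(P) in bits; entries with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def perform_action(probs, removed_indices):
    """Remove the given entries from P, then renormalize so the sum is 1."""
    kept = [p for i, p in enumerate(probs) if i not in removed_indices]
    total = sum(kept)
    return [p / total for p in kept]

P = [0.5, 0.25, 0.125, 0.125]
# H_i(P): entropy after a single action a_i (here: remove the first entry).
P_after = perform_action(P, {0})
print(shannon_entropy(P_after))  # → 1.5
# H_j(P | a_i): apply a_j to the already-modified, renormalized set.
print(shannon_entropy(perform_action(P_after, {0})))  # → 1.0
```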
Proof? Now, suppose I choose two actions $a_1$ and $a_2$ such that: $$ H_1(P) \leq H_2(P). $$ How can I prove that the following holds (or, if it does not hold in general, characterize the cases in which it does)? $$ \forall a_i,\ i \neq 1,2: \quad H_i(P \mid a_1) \leq H_i(P \mid a_2) $$
Let $P = [(1-\epsilon)/2,(1-\epsilon)/2,\epsilon/2,\epsilon/2]$ for a small $\epsilon > 0$. Choose $a_1$ to remove one of the $(1-\epsilon)/2$ entries, and $a_2$ to remove one of the $\epsilon/2$ entries. After $a_1$, we are left with the set of probabilities $$P_1 = \left[\frac{1-\epsilon}{2},\frac{\epsilon}{2},\frac{\epsilon}{2}\right]\frac{2}{1+\epsilon},$$ and after $a_2$, we are left with $$P_2 = \left[\frac{1-\epsilon}{2},\frac{1-\epsilon}{2},\frac{\epsilon}{2}\right]\frac{2}{2-\epsilon}.$$ For small $\epsilon$, $P_1$ is nearly deterministic, so $H_1(P) \approx 0$, while $P_2$ is nearly uniform over two outcomes, so $H_2(P) \approx 1$; hence $H_1(P) \leq H_2(P)$. Now, choose $a_3$ to remove the first probability in each of $P_1$ and $P_2$ (you may even force them to have the same "initial" index). Then $P_1$ becomes the uniform pair $[1/2, 1/2]$ with entropy $1$, while $P_2$ becomes $[1-\epsilon, \epsilon]$ with entropy close to $0$. So $H_3(P|a_1) \geq H_3(P|a_2)$, and we have a counterexample to your assertion. I do not think you will be able to get a simple expression for when your assertion holds.
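As a sanity check, the counterexample can be verified numerically, e.g. with $\epsilon = 0.01$ (a sketch; the helper names are mine):

```python
import math

def H(probs):
    """Shannon entropy in bits; entries with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def normalize(probs):
    s = sum(probs)
    return [p / s for p in probs]

eps = 0.01
P = [(1 - eps) / 2, (1 - eps) / 2, eps / 2, eps / 2]

P1 = normalize(P[1:])           # a_1 removes one (1-eps)/2 entry
P2 = normalize(P[:2] + P[3:])   # a_2 removes one eps/2 entry
assert H(P1) <= H(P2)           # H_1(P) <= H_2(P), as required

# a_3 removes the first remaining probability from each set.
H3_given_a1 = H(normalize(P1[1:]))   # ≈ 1: the survivors are equal
H3_given_a2 = H(normalize(P2[1:]))   # ≈ 0: the survivors are [1-eps, eps]
assert H3_given_a1 >= H3_given_a2    # the claimed inequality is reversed
print(H3_given_a1, H3_given_a2)
```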