Using a Markov chain to calculate the expected value of conditional/constrained choices (TopCoder PancakeStack)


I've been working on a programming challenge (link) where an expected value is calculated.

Recently I learned about Markov chains and successfully applied them to solving a set of problems, but the problems did not involve constraints as in this one.

Because I'm quite new to the field, can anyone give me hints about terminology I should search for, or about how to set up the problem so that it can be solved with a Markov chain? Is this a problem where Markov chains are applicable at all?

Best answer:

This problem is not well suited to Markov chains, in my opinion. It is better suited to combinatorial probability, i.e. counting the ways that specified outcomes (events) can occur.

For $i=0,1,\ldots,N-1$, define:

\begin{eqnarray*} \text{Event: }\quad S_i &=& \text{Pancake $i$ is on the stack} \\ && \\ \text{R.V.: }\quad X\; &=& \text{Total deliciousness of the stack} \\ \text{R.V.: }\quad X_i &=& \text{Total deliciousness of the stack due to Pancake $i$.} \\ \end{eqnarray*}

So we have $X = \sum_{i=0}^{N-1}X_i$. Note that $X_i=0$ if Pancake $i$ is not on the stack. By linearity of expectation:

\begin{eqnarray*} E(X) &=& E\left(\sum_{i=0}^{N-1} X_i\right) \\ &=& \sum_{i=0}^{N-1} E(X_i) \\ &=& \sum_{i=0}^{N-1} d_i P(S_i) \qquad\qquad\text{where $d_i$ is the deliciousness of Pancake $i$.} \\ \end{eqnarray*}

The probability $P(S_i)$ depends on when Pancake $i$ is selected. If it is the $j^{th}$ selection, then for it to go on the stack all $j-1$ previous selections must be larger (i.e. among Pancakes $i+1$ to $N-1$) and must have been made in descending order. There are $\binom{N-i-1}{j-1}$ ways to pick $j-1$ of those $N-i-1$ pancakes, each giving exactly one valid (descending) order. The total number of ordered selections of $j-1$ pancakes, given that the $j^{th}$ selection is Pancake $i$ and is therefore excluded, is $(N-1)!/(N-j)!$, since we choose from the remaining $N-1$ pancakes. Thus,

$$P(S_i\mid \text{Pancake $i$ is the $j^{th}$ selection}) = \binom{N-i-1}{j-1} \bigg/ \dfrac{(N-1)!}{(N-j)!}.$$
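This conditional probability can be sanity-checked by brute-force enumeration for small $N$. Below is a minimal Python sketch, under my reading of the process implied by the derivation: pancakes are indexed in increasing order of size, a selected pancake is placed only if it is smaller than the current top, and selection stops at the first pancake that does not fit. All function names are my own.

```python
from itertools import permutations
from math import comb, factorial

def reaches_stack(order, i):
    """Does Pancake i end up on the stack for this selection order?
    A pancake is placed only if smaller than the current top; the
    process stops at the first pancake that does not fit."""
    top = float('inf')              # empty stack accepts anything
    for p in order:
        if p >= top:                # first misfit: process stops
            return False
        top = p
        if p == i:
            return True
    return False

def p_conditional(N, i, j):
    """P(S_i | Pancake i is the j-th selection), by enumerating all
    permutations of range(N) with i fixed in position j (1-indexed)."""
    hits = total = 0
    for perm in permutations(range(N)):
        if perm[j - 1] == i:
            total += 1
            hits += reaches_stack(perm, i)
    return hits / total

def p_formula(N, i, j):
    """The closed form: C(N-i-1, j-1) / ((N-1)!/(N-j)!)."""
    return comb(N - i - 1, j - 1) * factorial(N - j) / factorial(N - 1)

# Spot-check: the formula matches enumeration for every (i, j).
N = 5
for i in range(N):
    for j in range(1, N + 1):
        assert abs(p_conditional(N, i, j) - p_formula(N, i, j)) < 1e-12
```

Note that for $j > N-i$ the binomial coefficient is zero, which matches the fact that the event $S_i$ is then impossible.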

Continuing via the law of total probability, and noting that Pancake $i$ is equally likely to occupy each of the $N$ selection positions (so each position has probability $1/N$):

\begin{eqnarray*} E(X) &=& \sum_{i=0}^{N-1} d_i \sum_{j=1}^{N}P(S_i\mid \text{Pancake $i$ is $j^{th}$ selection})P(\text{Pancake $i$ is the $j^{th}$ selection}) \\ &=& \sum_{i=0}^{N-1} d_i \sum_{j=1}^{N-i} \dfrac{1}{N} \binom{N-i-1}{j-1} \bigg/ \dfrac{(N-1)!}{(N-j)!} \\ && \qquad\qquad\text{Upper limit for $j$ is $N-i$: for higher $j$, event $S_i$ is impossible} \\ &=& \sum_{i=0}^{N-1} d_i \sum_{j=1}^{N-i} \dfrac{(N-j)!}{N!} \binom{N-i-1}{j-1}. \end{eqnarray*}
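Putting the pieces together, the final double sum can be implemented directly and verified against exhaustive enumeration of all $N!$ selection orders. A sketch under the same assumptions as before (pancakes indexed by size rank; the deliciousness list `d` is my own example input):

```python
from itertools import permutations
from math import comb, factorial

def expected_deliciousness(d):
    """E(X) = sum_i d_i * sum_{j=1}^{N-i} (N-j)!/N! * C(N-i-1, j-1)."""
    N = len(d)
    return sum(
        d[i] * sum(
            factorial(N - j) * comb(N - i - 1, j - 1) / factorial(N)
            for j in range(1, N - i + 1)
        )
        for i in range(N)
    )

def expected_by_enumeration(d):
    """Average total stack deliciousness over all N! selection orders."""
    N = len(d)
    total = 0
    for perm in permutations(range(N)):
        top = float('inf')
        for p in perm:
            if p >= top:            # first misfit ends the process
                break
            top = p
            total += d[p]
    return total / factorial(N)

d = [3, 1, 4, 1, 5]   # d[i] = deliciousness of Pancake i (size rank i)
assert abs(expected_deliciousness(d) - expected_by_enumeration(d)) < 1e-9
```

The closed form runs in $O(N^2)$ time, versus $O(N \cdot N!)$ for the enumeration, so it scales to the problem's full constraints while the enumeration serves only as a correctness check.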