I will roll a die and ask you: what is the probability that you guess the roll correctly at random? The answer is obviously 1/6. Then I will look at the result and tell you whether it is 5 or not. What is the probability that you guess the number correctly afterwards?
In Bayesian terms, you can encode the extra information into the prior: the prior is p = 1/5 for each face other than 5, and 0 for 5, given that the result is not 5. The posterior, being the normalized product of the uniform die-roll distribution and this prior, is then the same as the prior. That makes sense: you should have a 1/5 chance of guessing the die correctly if, for example, you know it is not 5.
Are there other ways of solving this problem? In particular, I am looking for a formal frequentist solution.
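A frequentist reading of the problem is just long-run relative frequency, which can be checked by simulation. The sketch below (my own illustration, not from any particular reference) repeats the experiment, keeps only the trials where the informant would say "not 5", and counts how often a uniform guess among the remaining five faces is correct:

```python
import random

random.seed(0)
trials = 100_000
hits = 0      # correct guesses among retained trials
count = 0     # trials where the result was not 5

for _ in range(trials):
    roll = random.randint(1, 6)
    if roll == 5:
        continue  # we were told the result is not 5, so condition on that
    count += 1
    guess = random.choice([1, 2, 3, 4, 6])  # guess uniformly among remaining faces
    if guess == roll:
        hits += 1

print(hits / count)  # empirical frequency, close to 1/5
```

Conditioning here is literal: trials inconsistent with the reported information are simply discarded, and the frequency is computed over the rest.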
The main idea here is that we had $6$ possibilities all with equal weight, but one was eliminated, so now we simply have $5$ possibilities all with equal weight.
Generalizing this idea, let there be a set of $n$ probabilities $P_i$ with $$\sum_{i=1}^n P_i=1$$ Now subtract the final probability $P_n$, thus eliminating it from the list: $$\sum_{i=1}^{n-1}P_i=1-P_n$$ To get the new probabilities $Q_i$, they must still sum to one, so we divide each side of the equation by $1-P_n$: $$\sum_{i=1}^{n-1}\frac{P_i}{1-P_n}=1$$ Thus each new probability after eliminating $P_n$ is $$Q_i=\frac{P_i}{1-P_n}$$
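As a small sanity check of the formula $Q_i = P_i/(1-P_n)$, here is a sketch (names are my own, for illustration) that eliminates one outcome from an arbitrary distribution and renormalizes the rest:

```python
def renormalize(probs, eliminated):
    """Drop probs[eliminated] and rescale the remaining entries to sum to 1,
    i.e. Q_i = P_i / (1 - P_eliminated)."""
    total = 1.0 - probs[eliminated]
    return [p / total for i, p in enumerate(probs) if i != eliminated]

# Eliminating face 5 (index 4) from a fair die gives 1/5 for each remaining face.
q = renormalize([1/6] * 6, 4)
print(q)  # five entries, each 0.2
```

For the uniform die this reproduces the 1/5 answer, and it applies just as well to a biased die.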