Consider this game: Alice secretly rolls one die either once or twice and tells Bob the total. Based solely on that sum, Bob has to guess whether the die was rolled once or twice.
We can easily calculate the probability that Bob's guess is right. As a simple example, assume the number of rolls $N$ is distributed 50/50 between 1 and 2. The best Bob can do when told a sum $S=s$ is to bet on $$\mathrm{arg}\max_n P(N=n \mid S=s),$$ which can be calculated using Bayes' theorem.
The overall probability that Bob's guess is wrong is $$\omega = \sum_{s=1}^{12} (1- \max_n P(N=n| S=s))\cdot P(S=s),$$ which is $\approx 20.8\,\%$ here.
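This value is easy to verify numerically. The sketch below (function and variable names are my own, not from the question) evaluates $\omega$ for a fair die with exact rational arithmetic; the error mass at each sum $s$ is the probability of the less likely hypothesis there, i.e. $\min_n P(N=n)\,P(S=s \mid N=n)$.

```python
from fractions import Fraction

# PMFs of the sum for N=1 and N=2 rolls of a fair die
p1 = {s: Fraction(1, 6) for s in range(1, 7)}
p2 = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

# Bob errs on sum s exactly when the less likely hypothesis was true,
# so the error mass at s is min over n of P(N=n) * P(S=s | N=n).
omega = sum(min(Fraction(1, 2) * p1.get(s, Fraction(0)),
                Fraction(1, 2) * p2.get(s, Fraction(0)))
            for s in range(1, 13))
print(omega, float(omega))  # 5/24 ≈ 0.2083
```

The exact value is $\omega = 15/72 = 5/24 \approx 20.8\,\%$, matching the figure above.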
If Alice is now allowed to build her own unfair die, i.e., choose the probability $P(X=x) := P(S=x \mid N=1)$ for each face value, how should she build the die so that Bob's chances of winning are minimized?
Obviously, this is an optimization problem: we can choose $P(X=x)$ to be any PMF, and the objective function to maximize is $\omega$. (See Edit below and the comments.) To tackle the problem, I got rid of the max operator in $\omega$ by instead maximizing the entropy of $P(N=n \mid S=s)$. Also, $P(S=s \mid N=n)$ can be calculated by convolution: $P(S=s \mid N=n) = P(X=x)^{*n}(s)$, with $^{*n}$ denoting the $n$-th convolution power.
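For concreteness, the convolution power can be computed directly; the helper below (`conv_power` is a name I made up for illustration) treats a PMF over face values $1..6$ as a list and convolves it with itself $n-1$ times.

```python
def conv_power(pmf, n):
    """n-th convolution power of a PMF given as a list over values 1..6.

    The returned list covers sums n..(6n); entry k is P(S = k + n)."""
    result = pmf[:]
    for _ in range(n - 1):
        new = [0.0] * (len(result) + len(pmf) - 1)
        for i, a in enumerate(result):
            for j, b in enumerate(pmf):
                new[i + j] += a * b
        result = new
    return result

fair = [1 / 6] * 6
two_rolls = conv_power(fair, 2)   # support starts at sum 2
print(two_rolls[7 - 2])           # P(S=7 | N=2) = 6/36 ≈ 0.1667
```

For a fair die this reproduces the familiar triangular distribution of two-dice sums.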
I can formulate this as a 6-dimensional optimization problem and solve it numerically (for those interested: $P(X=1..6) \approx 0.35, 0.23, 0.16, 0.12, 0.07, 0.05$). However, once we go from a six-sided die to a continuous probability distribution, I can no longer solve it numerically. Is there any non-numeric way to solve this?
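To illustrate the finite-dimensional version, here is a crude random hill-climb that maximizes $\omega$ directly (not the entropy surrogate, and not the solver actually used in the question; all names are mine). It starts from the fair die and keeps any random perturbation on the probability simplex that increases Bob's error.

```python
import random

def omega(p):
    """Bob's error probability when the single-roll PMF over faces 1..6 is p."""
    p2 = [0.0] * 11  # PMF of the two-roll sum, covering sums 2..12
    for i in range(6):
        for j in range(6):
            p2[i + j] += p[i] * p[j]
    err = 0.0
    for s in range(1, 13):
        a = 0.5 * (p[s - 1] if 1 <= s <= 6 else 0.0)   # P(N=1) * P(S=s | N=1)
        b = 0.5 * (p2[s - 2] if 2 <= s <= 12 else 0.0) # P(N=2) * P(S=s | N=2)
        err += min(a, b)  # mass where Bob's best guess is still wrong
    return err

random.seed(0)
best = [1 / 6] * 6        # start from the fair die
best_val = omega(best)
for _ in range(20000):
    # random perturbation, clipped and renormalized back onto the simplex
    cand = [max(x + random.gauss(0, 0.02), 1e-9) for x in best]
    total = sum(cand)
    cand = [x / total for x in cand]
    val = omega(cand)
    if val > best_val:
        best, best_val = cand, val
print(round(best_val, 4), [round(x, 3) for x in best])
```

By construction the result can only improve on the fair die's $\omega \approx 20.8\,\%$; a proper constrained solver would be more reliable, but this suffices to confirm that an unfair die helps Alice.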
Edit: The numbers above ($P(X=1..6)$) are not correct, as pointed out in the comments. The error was introduced by the step from maximizing $\omega$ to maximizing the entropy of $P(N=n \mid S=s)$, which is not a valid substitution. Please see the comments.