Say we are given a biased die that shows two of its sides (say $1$ and $2$) with a combined probability of $0.8$, and the remaining sides ($3,4,5,$ and $6$) with a combined probability of $0.2$. In order to determine the favoured sides, we repeatedly throw the die and build a frequency histogram.
In an ideal case, say we throw the die $1000$ times; then we expect the favoured sides to show up significantly more often than the others. Specifically, in this example, sides $1$ and $2$ are each expected to show up around $400$ times, while each remaining side is expected to show up around $50$ times, and we can easily make a judgment.
We are tasked with determining the probability of making a false judgment: that is, the probability that we conclude that sides such as $1$ and $3$ are the favoured sides, instead of the true favoured sides $1$ and $2$, after we roll the die $R$ times in total.
-- Adding some clarifications based on comments
We only know the number of favoured sides, not their identity. It may happen that sides $3$ and $4$ appear $450$ and $350$ times, respectively, even though $1$ and $2$ are the favoured sides and appear only $45$ and $55$ times, respectively.
I was hoping to figure out the probability of this unlikely event and to show that, though it is positive, it becomes insignificant when $R$ is large. Further, perhaps come up with some lower bounds on $R$.
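For concreteness, the judgment procedure I have in mind can be sketched with a small Monte Carlo simulation, assuming the favoured sides have probability $0.4$ each and the rest $0.05$ each (the function and parameter names here are purely illustrative):

```python
import random
from collections import Counter

def misjudge_rate(R, trials=2000, seed=0):
    """Estimate the probability that the two most frequent sides after R
    rolls are NOT the truly favoured sides 1 and 2 (assumed probabilities:
    0.4 each for sides 1-2, 0.05 each for sides 3-6)."""
    rng = random.Random(seed)
    sides = [1, 2, 3, 4, 5, 6]
    weights = [0.4, 0.4, 0.05, 0.05, 0.05, 0.05]
    errors = 0
    for _ in range(trials):
        counts = Counter(rng.choices(sides, weights=weights, k=R))
        # Judge the two most frequent sides to be the favoured ones
        top_two = {side for side, _ in counts.most_common(2)}
        if top_two != {1, 2}:
            errors += 1
    return errors / trials
```

With $R = 1000$ the estimated rate comes out essentially $0$, while for very small $R$ it is clearly positive; that dependence on $R$ is exactly what I would like to quantify.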
Addendum-2 just added, at the end of this answer, to respond to the OP's (i.e. original poster's) comment/question.
See the just added Addendum, which adjusts my response based on the OP's (i.e. original poster's) editing of the original question.
I (somewhat) buried the response in the Addendum, rather than revising the entire answer, because the original work remains very valuable for fine-tuning the computations.
That is, it becomes easier to piggyback the final computations off of the original work.
If the following simplifying assumptions are off point, please advise; then I will probably have to delete this answer. If I understand correctly, this is a conditional probability problem, rather than a statistics problem.
Assume that you have thrown the die $1000$ times and it has come up: $1$ and $2$ each $400$ times, and $3, 4, 5, 6$ each $50$ times.
Further assume that you know that the die is biased: two of the sides will each appear $40\%$ of the time, while the other $4$ sides will each appear $5\%$ of the time.
Further assume that you know that one of the biased sides is the side where $1$ shows face up on the die.
Further assume that you know that the other biased side is either $2$ face up or $3$ face up, on the die.
So, you attach a probability to the data of the $1000$ rolls occurring, given each of the $2$ hypotheses. Then, you have the relative probability of each of the hypotheses being true.
$\underline{\text{Case 1: the biased sides are 1 and 2}}$
Then, the probability of the experimental data occurring is
$$P_1 = \left[\binom{1000}{400} \times (.4)^{400}\right] \times \left[\binom{600}{400} \times (.4)^{400}\right]$$
$$ \times \left[\binom{200}{50} \times (.05)^{50}\right] \times \left[\binom{150}{50} \times (.05)^{50}\right]$$
$$ \times \left[\binom{100}{50} \times (.05)^{50}\right] \times \left[\binom{50}{50} \times (.05)^{50}\right].$$
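A note on evaluating $P_1$: the binomial coefficients are astronomically large and the power factors astronomically small, so a direct product underflows ordinary floating point. A log-space sketch of the product above (my own illustration, using Python's exact `math.comb`):

```python
import math

def log10_P1():
    """log10 of P_1: each bracketed factor above contributes
    log10(C(n, k)) + k * log10(p)."""
    factors = [
        (1000, 400, 0.4),
        (600, 400, 0.4),
        (200, 50, 0.05),
        (150, 50, 0.05),
        (100, 50, 0.05),
        (50, 50, 0.05),
    ]
    return sum(math.log10(math.comb(n, k)) + k * math.log10(p)
               for n, k, p in factors)
```

The same value can be obtained from the equivalent multinomial form $1000!/(400!\,400!\,50!^4) \times (.4)^{800} \times (.05)^{200}$, which is a useful cross-check.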
$\underline{\text{Case 2: the biased sides are 1 and 3}}$
Then, the probability of the experimental data occurring is
$$P_2 = \left[\binom{1000}{400} \times (.4)^{400}\right] \times \left[\binom{600}{400} \times (.05)^{400}\right]$$
$$ \times \left[\binom{200}{50} \times (.4)^{50}\right] \times \left[\binom{150}{50} \times (.05)^{50}\right]$$
$$ \times \left[\binom{100}{50} \times (.05)^{50}\right] \times \left[\binom{50}{50} \times (.05)^{50}\right].$$
Then, the probability that Case 1 is accurate, rather than Case 2 is
$$\frac{P_1}{P_1 + P_2} = \frac{(.4)^{400} \times (.05)^{50}}{\left[(.4)^{400} \times (.05)^{50}\right] ~~+~~ \left[(.05)^{400} \times (.4)^{50}\right]}.$$
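Numerically, the cancelled ratio shows how lopsided this is: $P_2/P_1 = (0.05/0.4)^{350} = 8^{-350}$, roughly $10^{-316}$, so the probability that Case 1 is accurate is indistinguishable from $1$. A quick log-space check (variable names are my own):

```python
import math

# log10 of the two surviving factors after cancelling the common terms
log_p1 = 400 * math.log10(0.4) + 50 * math.log10(0.05)
log_p2 = 400 * math.log10(0.05) + 50 * math.log10(0.4)

# P_2 / P_1 = (0.05 / 0.4)**350 = 8**(-350), about 10**(-316)
log_ratio = log_p2 - log_p1
prob_case1 = 1.0 / (1.0 + 10 ** log_ratio)
```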
Addendum
Altering the problem.
Suppose that the original problem that is analyzed above is altered, in the following way:
You know that exactly $2$ of the $6$ sides are biased, but you don't know which two they are. All that you have is the experimental data described at the start of this problem, where the numbers $(1)$ and $(2)$ each occurred $400$ times, while each of the other numbers occurred $50$ times.
So, instead of two competing hypotheses, namely that the biased sides are either [1::2] or [1::3], you have $~\displaystyle \binom{6}{2} = 15~$ competing hypotheses, because that is how many combinations there are of $6$ items, selected $2$ at a time, without replacement.
So, refer to these hypotheses (i.e. events) as
$E_1, E_2, \cdots, E_{14}, E_{15}.$
Let $P_k$ denote the probability that the experimental data occurred, under the assumption that hypothesis $E_k$ is accurate, for $k \in \{1,2,\cdots,15\}.$
Let $W = P_1 + P_2 + \cdots + P_{14} + P_{15}.$
Then, the probability that hypothesis $E_k$ is accurate is
$$\frac{P_k}{W} \tag1 .$$
Therefore, the problem has been reduced to computing each of $P_1, P_2, \cdots, P_{15}.$
Note that it is not necessary to compute each of $P_1, P_2, \cdots, P_{15}$ exactly. Instead, given the computation in (1) above, all that is necessary is to compute the relative magnitudes of $P_1, P_2, \cdots, P_{15}$.
To formalize the computations, I will label the events (i.e. hypotheses) as follows:
$E_1~:~$ [1::2] are the biased sides.
$E_2~:~$ [1::3] are the biased sides.
$E_3~:~$ [1::4] are the biased sides.
$E_4~:~$ [1::5] are the biased sides.
$E_5~:~$ [1::6] are the biased sides.
$E_6~:~$ [2::3] are the biased sides.
$E_7~:~$ [2::4] are the biased sides.
$E_8~:~$ [2::5] are the biased sides.
$E_9~:~$ [2::6] are the biased sides.
$E_{10}~:~$ [3::4] are the biased sides.
$E_{11}~:~$ [3::5] are the biased sides.
$E_{12}~:~$ [3::6] are the biased sides.
$E_{13}~:~$ [4::5] are the biased sides.
$E_{14}~:~$ [4::6] are the biased sides.
$E_{15}~:~$ [5::6] are the biased sides.
Then, computing the relative magnitudes:
$\displaystyle P_1 ~:$
$\displaystyle \left[(.4)^{400}\right]^2 \times \left[(.05)^{50}\right]^4.$
Each of $P_2$ through $P_9 ~:$
$\left[(.4)^{400}\right] \times \left[(.05)^{400}\right] \times \left[(.4)^{50}\right] \times \left[(.05)^{50}\right]^3.$
Each of $P_{10}$ through $P_{15} ~:$
$\left[(.05)^{400}\right]^2 \times \left[(.4)^{50}\right]^2 \times \left[(.05)^{50}\right]^2.$
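These three relative magnitudes can also be generated and normalized programmatically. A sketch (variable names are my own) that enumerates all $15$ pairs for the original data, works in log10, and subtracts the largest log before exponentiating so that the weights do not underflow:

```python
import math
from itertools import combinations

# Observed counts from the 1000 rolls described above
data = {1: 400, 2: 400, 3: 50, 4: 50, 5: 50, 6: 50}

# log10 relative magnitude of each "these two sides are biased" hypothesis
log_mag = {}
for pair in combinations(range(1, 7), 2):
    log_mag[pair] = sum(
        count * math.log10(0.4 if side in pair else 0.05)
        for side, count in data.items()
    )

# Normalize stably: subtract the largest log before exponentiating,
# so the dominant weight is 1 and the rest underflow harmlessly to ~0
m = max(log_mag.values())
weights = {pair: 10 ** (lm - m) for pair, lm in log_mag.items()}
W = sum(weights.values())
posterior = {pair: w / W for pair, w in weights.items()}
```

For this data the hypothesis $E_1$ (biased sides $1$ and $2$) dominates overwhelmingly, as expected.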
Addendum-2
Responding to the comment/question of the OP (i.e. original poster)
I will piggyback off of the Addendum, which itself piggybacks off of the initial answer.
For illustration, I will attack each of the $2$ hypothetical observed distributions separately.
In both cases, I will start with the presumptions that:
Assume that you have a distribution of
$\{1:a_1\}$
$\{2:a_2\}$
$\{3:a_3\}$
$\{4:a_4\}$
$\{5:a_5\}$
$\{6:a_6\}.$
This means that it is being assumed that you rolled the die $(a_1 + \cdots + a_6)$ times, and the number $k$ appeared $(a_k)$ times.
You have $15$ relative probabilities to compute:
$P_1, P_2, \cdots, P_{15}.$
Assume that $r \in \{1,2,\cdots,15\}$.
Assume that you are computing the relative
magnitude of $P_r$ which represents the probability
that hypothesis $E_r$ is accurate.
Assume that with respect to hypothesis $E_r:$
The probability of the die rolling a $k$ is
presumed to be $s_k$
where $s_1 + s_2 + \cdots + s_6 = 1.$
Then, the relative magnitude of $P_r$ should be computed as
$$\text{Relative magnitude of} ~P_r = \prod_{i=1}^6 (s_i)^{a_i}. \tag2 $$
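Formula (2) is straightforward to implement in log space; the function below (its name and default arguments are my own) takes the observed counts and a hypothesised pair of biased sides:

```python
import math

def log10_relative_magnitude(counts, biased_pair, p_biased=0.4, p_fair=0.05):
    """log10 of formula (2): the sum over sides of a_i * log10(s_i), where
    s_i is p_biased for the two hypothesised biased sides, p_fair otherwise."""
    return sum(
        a * math.log10(p_biased if side in biased_pair else p_fair)
        for side, a in counts.items()
    )
```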
For illustration purposes, I will select an arbitrary $P_r$ to compute for Distribution-1, and an arbitrary $P_r$ to compute for Distribution-2.
$\underline{\text{Distribution-1} ~: ~\{1:450, 2:350, 3:45, 4:55, 5:50, 6:50\}}$
Here, you have that
$(a_1, a_2, \cdots, a_6) = (450,350,45,55,50,50).$
Suppose that I am computing the relative magnitude of $P_8$, for Distribution-1.
The assigned probabilities for $(s_1,\cdots,s_6)$
that represent hypothesis $E_8$ are
$(s_1,s_2,s_3,s_4,s_5,s_6) = (0.05, 0.4, 0.05, 0.05, 0.4, 0.05).$
Therefore, for Distribution-1, the relative magnitude of $P_8$ is computed as
$$\left[(0.05)^{450}\right] \times \left[(0.4)^{350}\right] \times \left[(0.05)^{45}\right] \times \left[(0.05)^{55}\right] \times \left[(0.4)^{50}\right] \times \left[(0.05)^{50}\right].$$
$\underline{\text{Distribution-2} ~: ~\{1:710, 2:80, 3:30, 4:60, 5:60, 6:60\}}$
Here, you have that
$(a_1, a_2, \cdots, a_6) = (710,80,30,60,60,60).$
Suppose that I am computing the relative magnitude of $P_{12}$, for Distribution-2.
The assigned probabilities for $(s_1,\cdots,s_6)$
that represent hypothesis $E_{12}$ are
$(s_1,s_2,s_3,s_4,s_5,s_6) = (0.05, 0.05, 0.4, 0.05, 0.05, 0.4).$
Therefore, for Distribution-2, the relative magnitude of $P_{12}$ is computed as
$$\left[(0.05)^{710}\right] \times \left[(0.05)^{80}\right] \times \left[(0.4)^{30}\right] \times \left[(0.05)^{60}\right] \times \left[(0.05)^{60}\right] \times \left[(0.4)^{60}\right].$$
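Putting the two illustrations together, the sketch below (names are my own) computes the full normalized posterior over all $15$ hypotheses for each distribution. Note that since every hypothesis assigns $0.4$ to its pair and $0.05$ elsewhere, the winning hypothesis is simply the pair with the largest combined count, so both distributions still favour [1::2]:

```python
import math
from itertools import combinations

def posteriors(counts, p_biased=0.4, p_fair=0.05):
    """Normalized posterior weight of each 'pair of biased sides' hypothesis,
    assuming a uniform prior over the 15 pairs; computed in log space."""
    log_mag = {
        pair: sum(a * math.log10(p_biased if side in pair else p_fair)
                  for side, a in counts.items())
        for pair in combinations(sorted(counts), 2)
    }
    m = max(log_mag.values())
    w = {pair: 10 ** (lm - m) for pair, lm in log_mag.items()}
    total = sum(w.values())
    return {pair: v / total for pair, v in w.items()}

dist1 = {1: 450, 2: 350, 3: 45, 4: 55, 5: 50, 6: 50}
dist2 = {1: 710, 2: 80, 3: 30, 4: 60, 5: 60, 6: 60}
```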