This puzzle is a generalization of this one. We have $3c$ coins, all but $2$ of which are standard coins that weigh the same. We also have one coin heavier than all the others, and one coin lighter than all the others. We have three bags, colored red, green, and blue, each containing $c$ coins. It is known that the heavy and light coins are in different bags. What is the minimum number of weighings needed to locate and identify the bad coins, using only a two-pan balance?
I haven't been able to find the full solution. Thanks to a hint from user antkam, I can show that if $c=3^n+3^{n-1}$ then $2n+3$ weighings are necessary and sufficient. Necessity is easy. There are $6$ possibilities for which bags the bad coins are in, and then $c$ possibilities for the identity of each. Since a weighing has at most $3$ outcomes (left pan lighter than, heavier than, or the same as the right pan), at least $\left\lceil\log_36c^2\right\rceil$ weighings are required. When $c=3^n+3^{n-1}=4\cdot3^{n-1}$, this gives $$ \left\lceil\log_36+\log_316+2(n-1)\right\rceil=\lceil\log_396+2n-2\rceil=2n+3 $$ For sufficiency, we use antkam's algorithm. Split each bag into a pile of $3^n$ coins and a pile of $3^{n-1}$ coins. We start by weighing the large piles against one another. If at least one of the bad coins is present in the large piles, it will take $3$ weighings to classify the piles, but if all the large-pile coins are good, it will only take $2$ weighings.
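The counting argument is easy to check by machine. Here is a small sketch (the function name `min_weighings_lower_bound` is mine) that computes $\lceil\log_36c^2\rceil$ with exact integer arithmetic:

```python
def min_weighings_lower_bound(c):
    """Information-theoretic lower bound: there are 6*c*c configurations
    (6 ordered choices of light/heavy bags, c positions for each bad coin),
    and each weighing has at most 3 outcomes."""
    states = 6 * c * c
    w = 0
    while 3 ** w < states:
        w += 1
    return w

# For c = 3^n + 3^(n-1), the bound works out to 2n + 3.
for n in range(1, 6):
    c = 3 ** n + 3 ** (n - 1)
    print(n, c, min_weighings_lower_bound(c))
```

The integer loop avoids the floating-point edge cases that `math.log(states, 3)` can hit near exact powers of $3$.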
It is well known, and easy to prove, that if there is one bad coin among $3^n$ coins, and we know whether it is heavy or light, then we can locate it in $n$ weighings. If there are two bad coins in the big piles, it takes $3$ weighings to classify the piles, and another $2n$ to locate the coins.
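That well-known procedure is the usual ternary search; a minimal Python sketch (the function name `find_light` is mine) for a pile of $3^n$ coins containing a single coin known to be light:

```python
def find_light(coins):
    """Locate the unique light coin among 3**n coins in n weighings:
    weigh the first third against the second; the lighter pan, or the
    untouched third if they balance, must contain the light coin."""
    lo, hi = 0, len(coins)
    weighings = 0
    while hi - lo > 1:
        third = (hi - lo) // 3
        left = sum(coins[lo:lo + third])            # left pan
        right = sum(coins[lo + third:lo + 2 * third])  # right pan
        weighings += 1
        if left < right:
            hi = lo + third                  # light coin in first third
        elif right < left:
            lo, hi = lo + third, lo + 2 * third  # second third
        else:
            lo = lo + 2 * third              # untouched third
    return lo, weighings
```

The known-heavy case is symmetric: follow the heavier pan instead of the lighter one.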
If there is one bad coin in the large piles, say the light coin in the large red pile, it takes $3$ weighings to classify the large piles and $n$ to locate the light coin. We now know that the heavy coin is in either the small green pile or the small blue pile, and it takes one weighing to find out which. Then it takes $n-1$ weighings to identify the coin.
If there is no bad coin in the large piles, we find this out in $2$ weighings, and make another $3$ weighings to classify the small piles. Then it takes $2(n-1)$ weighings to locate the bad coins. In each case, we have identified both bad coins in $2n+3$ weighings.
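As a quick arithmetic check, the weighing totals in the three cases can be tabulated in Python (a sketch; the encoding by number of bad coins in the large piles is mine):

```python
def total_weighings(n, bad_in_large):
    """Weighings used by antkam's algorithm when c = 3^n + 3^(n-1),
    split by how many bad coins landed in the two large piles."""
    if bad_in_large == 2:
        # classify the large piles, then locate both bad coins
        return 3 + 2 * n
    if bad_in_large == 1:
        # classify the large piles, locate one bad coin, find the
        # right small pile, then locate the other bad coin
        return 3 + n + 1 + (n - 1)
    # detect the all-good large piles, classify the small piles,
    # then locate both bad coins
    return 2 + 3 + 2 * (n - 1)

# Every case comes to 2n + 3 weighings.
assert all(total_weighings(n, k) == 2 * n + 3
           for n in range(1, 10) for k in (0, 1, 2))
```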
I haven't been able to get any further. When $c=3^n$, we find that $\lceil2n+\log_36\rceil=2n+2$ weighings are necessary, but I haven't been able to decide if this is sufficient. Similarly, if $c>\frac{3^{n+1}}{\sqrt2}$, then at least $2n+4$ weighings are necessary, but I don't know when they are sufficient.
Unless it turns out that the simple lower bounds for necessity also give sufficiency, necessity will be hard to nail down, because of the difficulty of establishing that one has considered every possible algorithm. The algorithm only really applies when $c\geq4$. When $c=1$, $3$ weighings are necessary and sufficient. When $c=2$, it can be done in $4$ weighings, and I think I have proved it cannot be done in $3$. When $c=3$, I know at least $4$ weighings are required, and I can do it in $5$. Can it be done in $4$? I don't know.
The problem can be succinctly stated as, for a given value of $n$, what is the largest value of $c$ so that the bad coins can be identified in at most $n$ weighings?
Any insights?
I should add that nothing is known about the combined weight of the bad coins. They might weigh less than, more than, or the same as two good coins.
EDIT
I've done some more analysis, but I don't know if it leads anywhere. Here it is, for what it's worth:
I'm proceeding on the assumption that the first trial should be to weigh $m$ coins of one color against $m$ of some other color, for some $0<m<c$.
Suppose the first weighing tests $m$ red coins against $m$ green ones. Call the set of weighed red coins $R_1$, the set of unweighed red coins $R_2$, and define $G_1,G_2$ similarly.
If the pans balance, there are still $6$ possibilities for which bag is which:
- $R_2$ light, blue heavy: $(c-m)c$ ways
- $R_2$ heavy, blue light: $(c-m)c$ ways
- $G_2$ light, blue heavy: $(c-m)c$ ways
- $G_2$ heavy, blue light: $(c-m)c$ ways
- $R_2$ light, $G_2$ heavy: $(c-m)^2$ ways
- $R_2$ heavy, $G_2$ light: $(c-m)^2$ ways

giving $$ P_1(m):=4(c-m)c+2(c-m)^2=6c^2-8mc+2m^2 $$

possibilities in all.

If the pans do not balance, we may assume $R_1<G_1$. Then we know that either the light coin is in $R_1$ or the heavy coin is in $G_1$. If the light coin is in $R_1$, there are $2c$ possibilities for the heavy coin, so $2mc$ ways, and the same number if the heavy coin is in $G_1$. We have counted the case where the light coin is in $R_1$ and the heavy coin is in $G_1$ twice, so there are $$ P_2(m):=4cm-m^2 $$ possibilities in this case.

Differentiating, we see that on $(0,c)$, $P_1$ decreases and $P_2$ increases, so $\min\max\{P_1(m),P_2(m)\}$ occurs when $P_1(m)=P_2(m)$. Setting $6c^2-8mc+2m^2=4cm-m^2$ gives $3m^2-12cm+6c^2=0$, whose root in $(0,c)$ is $m=\left(2-\sqrt2\right)c$, where $$ P_1(m)=P_2(m)=2c^2 $$ If we are to complete the process in $n-1$ more weighings, we need $$ 3^{n-1}\geq2c^2\implies c\leq\sqrt{\frac{3^{n-1}}2} $$
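These counts can be double-checked by brute-force enumeration over all $6c^2$ configurations; a sketch (the function name `outcome_counts` is mine), which also lets one search for the integer $m$ minimizing the worst case:

```python
from itertools import permutations

def outcome_counts(c, m):
    """Bucket all 6*c*c configurations by the result of weighing the
    first m red coins against the first m green coins.  Unit weight
    offsets are safe here: the light and heavy coins are in different
    bags, so their effects on the pans can never cancel exactly."""
    balanced = left_light = left_heavy = 0
    for light_bag, heavy_bag in permutations('RGB', 2):
        for i in range(c):        # position of the light coin in its bag
            for j in range(c):    # position of the heavy coin in its bag
                delta = 0         # (left pan) - (right pan)
                if light_bag == 'R' and i < m:
                    delta -= 1
                if heavy_bag == 'R' and j < m:
                    delta += 1
                if light_bag == 'G' and i < m:
                    delta += 1
                if heavy_bag == 'G' and j < m:
                    delta -= 1
                if delta == 0:
                    balanced += 1
                elif delta < 0:
                    left_light += 1
                else:
                    left_heavy += 1
    return balanced, left_light, left_heavy

# Worst-case number of remaining possibilities for each split size m.
c = 12
worst = {m: max(outcome_counts(c, m)) for m in range(1, c)}
best_m = min(worst, key=worst.get)
print(best_m, worst[best_m])
```

The enumeration is fast for small $c$ and serves as an independent check on the closed-form counts.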
Of course, it may not be true that minimizing the worst-case number of possibilities after the first weighing actually minimizes the number of weighings necessary in the worst case. I recall that in Don Knuth's solution of the game Mastermind, he says that he started with that assumption, and was surprised when it turned out to be wrong.
When $c=12$, antkam's algorithm optimally finds the bad coins in $7$ weighings, by first weighing $9$ red coins against $9$ green ones, whereas the heuristic suggests weighing $7$ against $7$. If we were to do this, we could not then follow the general plan of antkam's algorithm: locating a known light or heavy coin in a pile of $7$ still requires $2$ weighings, just as in a pile of $9$, while a small pile of $5$ coins requires $2$ weighings instead of the $1$ a pile of $3$ needs.
So, if this heuristic has any value, I think it must be in combination with a more complicated algorithm, probably one where the second trial depends on the results of the first weighing; so far, I haven't been able to design such an algorithm.