The following question may resemble another question of mine, but kindly don't flag it as a duplicate, since the final question here is different; only the rules of the game are the same.
Given a linear path with $n$ dots connected by $n-1$ lines, find the minimum number of coins required to win the game, where the game works as follows:
- You're given the linear path described above, and your friend places a set of coins on the dots. (You don't know in advance where the coins will be placed, as the placement is not controlled by you.)
- Aim of the game: you win if you can bring at least one coin to the destination dot (as chosen by your friend).
- Whenever you move a coin from a dot to an adjacent dot, you must permanently remove a second coin from the current dot (as a fee). If there is no coin available to remove as the fee, you can't move a coin from that dot to an adjoining one.
I noted that if there are $n$ dots and $n-1$ straight lines joining them, then the minimum number of coins for which you can force/guarantee a win is $2^{n-1}$. This can be proved fairly easily by working backwards from the destination and then considering the extreme case, where every coin starts at the dot farthest from the destination.
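For small $n$, the value $2^{n-1}$ in the extreme case can be checked by brute force. Below is a Python sketch (the names `winnable` and `path4` are my own, not from the question): it exhaustively searches move sequences, where each move carries one coin to a neighbouring dot and burns a second coin at the source as the fee.

```python
def winnable(counts, adj, dest):
    """Exhaustive search: can some sequence of moves bring a coin to dest?

    counts: list of coin counts per dot; adj: adjacency dict; dest: target dot.
    A move takes a coin from dot u to a neighbour v and permanently removes
    a second coin at u as the fee, so u must hold at least two coins.
    """
    dead = set()  # states already shown to be losing

    def dfs(state):
        if state[dest] >= 1:
            return True
        if state in dead:
            return False
        for u, c in enumerate(state):
            if c >= 2:  # need a coin to move plus a coin to burn
                for v in adj[u]:
                    nxt = list(state)
                    nxt[u] -= 2  # one coin moves away, one is removed as fee
                    nxt[v] += 1
                    if dfs(tuple(nxt)):
                        return True
        dead.add(state)
        return False

    return dfs(tuple(counts))

# Path with n = 4 dots, destination at one end, all coins at the other end:
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert winnable([0, 0, 0, 8], path4, 0)       # 2^(4-1) = 8 coins suffice
assert not winnable([0, 0, 0, 7], path4, 0)   # one coin fewer does not
```

Each move reduces the total number of coins by one, so the search space is finite and small for these sizes.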
Now, the question: what if the structure of the graph is not a linear path/list, but a rooted tree in which a node has $n$ children, each child again has a set of $n$ children, and this continues up to $n-1$ levels? In particular, this also means that every child has $n-1$ siblings (excluding itself).
For a brief example of the structure, take $n=3$. Let the top node be $X$. Then $X$ has $3$ children, namely $A_1, B_1, C_1$. Next, $A_1$ has $3$ children $A_2, A_2', A_2''$, and similarly for $B_1$ and $C_1$. Then $A_2$ has $3$ children $A_3, A_3', A_3''$, and similarly for $A_2', A_2''$, and likewise for the children of the $B_i\text{'s}$ and $C_i\text{'s}$. That's the entire structure for $n=3$.
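For concreteness, here is a small Python sketch (the function name is mine; I read the example above as three generations of children below the root for $n=3$) that builds this complete $n$-ary tree as an adjacency list:

```python
def complete_nary_tree(n, depth):
    """Adjacency-list dict of a complete n-ary tree with `depth`
    generations of children below the root (node 0)."""
    adj = {0: []}
    frontier = [0]  # current deepest generation
    next_id = 1
    for _ in range(depth):
        nxt = []
        for u in frontier:
            for _ in range(n):
                adj[u].append(next_id)   # parent -> child
                adj[next_id] = [u]       # child -> parent
                nxt.append(next_id)
                next_id += 1
        frontier = nxt
    return adj

# The n = 3 example: X, then the A_1/B_1/C_1 level, then two more generations.
tree = complete_nary_tree(3, 3)
assert len(tree) == 1 + 3 + 9 + 27  # 40 vertices in total
```

This representation plugs directly into any state-search or weighting computation over the tree.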
Incomplete Proof
I believe I have the solution to your coin game problem for general trees, so I'll put it as an answer under this question rather than your previous one. My proof is incomplete, but I'm fairly confident in the idea.
Let $T$ be a tree and $v \in V(T)$ be the destined vertex of $T$. We wish to determine the minimum number of coins required to guarantee a win. We do this in two parts: first, exhibit a configuration with just under the claimed minimum number of coins that does not permit a win; second, show that any configuration with at least that many coins always permits a win.
I've shifted the problem from counting coins to a weight associated with the distribution of the coins. I describe the maximum-weight coin distribution I think an unwinnable tree can have, and it seems that any coin distribution with higher weight must permit a win.
The Weighting: First let's assign a weight to each vertex. Create the breadth-first search tree of $T$ with root $v$, having levels $A_0 = \{v\}$, $A_1$, etc. Weight the vertices in level $A_i$ with $2^{-i}$. The weight of a coin distribution is then $\sum_i c_i \cdot w_i$, summed over all vertices $i$, where $c_i$ is the number of coins at vertex $i$ and $w_i$ is the weight of vertex $i$.
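As a sanity check of these definitions, here is a short Python sketch (the helper names are mine) that computes the levels $A_i$ by breadth-first search and evaluates the weight of a coin distribution with exact rational arithmetic:

```python
from collections import deque
from fractions import Fraction

def vertex_weights(adj, v):
    """BFS from the destined vertex v; a vertex in level A_i gets weight 2^-i."""
    level = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in level:
                level[w] = level[u] + 1
                q.append(w)
    return {u: Fraction(1, 2**i) for u, i in level.items()}

def distribution_weight(coins, adj, v):
    """Sum of c_i * w_i over all vertices holding coins (exact arithmetic)."""
    w = vertex_weights(adj, v)
    return sum(c * w[u] for u, c in coins.items())

# Path with 4 dots, destination 0: seven coins at distance 3 weigh 7/8.
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert distribution_weight({3: 7}, path4, 0) == Fraction(7, 8)
```

Using `Fraction` avoids any floating-point error in the powers of $2^{-i}$.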
The virtue of this weighting is that a move "towards $v$" conserves the weight. That is, under a move "towards $v$", two coins in level $A_{i+1}$ (with total weight $2 \cdot 2^{-(i+1)} = 2^{-i}$) are transformed into one coin in level $A_i$ (with weight $2^{-i}$).
It will also be convenient to think of moves as running in reverse, that is, a coin in level $A_i$ becoming two coins in level $A_{i+1}$.
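The conservation claim is a one-line computation; the following Python sketch (with a hypothetical `level_weight` helper of my own) checks it in both directions, and also records that a forward move *away* from $v$ strictly loses weight:

```python
from fractions import Fraction

def level_weight(counts_by_level):
    """Weight of coins grouped by BFS level: each coin in A_i contributes 2^-i."""
    return sum(c * Fraction(1, 2**i) for i, c in counts_by_level.items())

# Move towards v: two coins in A_2 become one coin in A_1 (here i = 1).
assert level_weight({2: 2}) == level_weight({1: 1}) == Fraction(1, 2)

# The reverse move (one coin in A_1 splitting into two in A_2) is the same
# identity read backwards, so it conserves weight as well.

# A forward move *away* from v: two coins in A_1 become one coin in A_2,
# dropping the weight from 1 to 1/4.
assert level_weight({1: 2}) > level_weight({2: 1})
```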
Intuitively, in a tree it never makes sense to make a forward move away from $v$, that is, to turn two coins in level $A_i$ into one coin in level $A_{i+1}$: such a move takes weight $2 \cdot 2^{-i}$ down to $2^{-(i+1)}$. So let's not do that (I may try to argue this rigorously later).
Also intuitively, a strategy for winning on this rooted tree is to make as many moves towards $v$ as you can and hope that you win. In the way I'm thinking about it, that's what makes this approach hard to extend to general graphs: for general graphs it is sometimes helpful to move away from the destined vertex in order to set up a stronger push towards it elsewhere (i.e., moving laterally or backwards within the breadth-first search tree structure).
I'm thinking that the worst-case scenario for the above strategy is a single coin on every vertex of $T$ except $v$: adding a coin anywhere would then allow you to win. This coin configuration gives some calculable weight $W$, and the claim is that $W$ is the maximum weight of an unwinnable coin arrangement. Given this arrangement, perform backwards moves pushing the coins away from $v$ in order to maximize the number of coins. To do this, whenever faced with the choice of sending the coins on a vertex $u$ down to its children $u_1, \dots, u_r$, choose the child whose subtree has the largest depth. This should maximize the number of coins in an unwinnable state.
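On a path this claim is consistent with the $2^{n-1}$ answer from the question: one coin on every dot except $v$ has weight $W = \sum_{i=1}^{n-1} 2^{-i} = 1 - 2^{-(n-1)}$, and pushing all of that weight down to the far end realizes it as $2^{n-1}-1$ coins, exactly one short of a guaranteed win. A quick Python check (the function name is mine):

```python
from fractions import Fraction

def unwinnable_weight_path(n):
    """Weight W of one coin on every dot except v, on a path of n dots
    (one coin in each of levels A_1, ..., A_{n-1})."""
    return sum(Fraction(1, 2**i) for i in range(1, n))

for n in range(2, 10):
    W = unwinnable_weight_path(n)
    # Pushed entirely to the deepest level n-1, W corresponds to
    # 2^(n-1) - 1 coins at the far end: one short of the path answer.
    assert W == Fraction(2**(n - 1) - 1, 2**(n - 1))
    assert W < 1  # strictly below the weight of a single coin on v itself
```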
This is all hand-wavy, but I think we can prove what I'm claiming. The proof would go through roughly these steps: show that forward moves away from $v$ never help, show that the greedy strategy of pushing coins towards $v$ is optimal, and show that the configuration described above attains the maximum weight among unwinnable arrangements. I think all these things are true, but I'll have to think more about their proofs.