Game Theory (Nash Equilibrium) from a wireless relay network perspective.


I am new to game theory, and I have a simple question about a Nash equilibrium point in a wireless relay network. Suppose we have a relay network in which a source node S communicates with a destination node D via a relay node R. Say the signal power received at R from the source is 10 units, and the minimum power required to decode this received signal is 4 units. Node R then consumes 4 units to decode the signal and uses the remaining 6 units of power (via an energy-harvesting technique) to transmit the decoded signal to the destination node D. In other words, assume

Received power ($PR$) $= 10$ units

Threshold decoding power ($DP$) $= 4$ units

Harvested power / transmit power ($TP$) $= 6$ units

Now the condition is: "Larger values of TP result in better performance at destination node D." In other words, node D always wants higher values of TP, but at the same time it cannot force node R to further decrease its DP in order to increase TP. If node R, while reducing DP, fails to successfully decode the signal received from node S, it will have nothing to forward to node D. In that case node D gains nothing, and the whole communication session fails.

So intuitively the optimal solution is for node R to use the minimum required power (i.e. 4 units) for signal decoding and the remaining power (i.e. 6 units) as TP.
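This intuition can be checked with a small sweep (a sketch under my own assumptions, not from the question: decoding is all-or-nothing at the 4-unit threshold, and the data rate at D grows like $\log_2(1+TP)$):

```python
import math

P_R = 10.0    # total power received at relay R (units)
DP_MIN = 4.0  # minimum decoding power for successful decoding

def d_payoff(dp):
    """Destination's payoff for a given decoding/transmit power split.
    Below the threshold the decode fails and nothing is forwarded."""
    if dp < DP_MIN:
        return 0.0                  # session failure: zero benefit for D
    tp = P_R - dp                   # leftover power is harvested as TP
    return math.log2(1.0 + tp)      # assumed rate model (illustrative only)

# sweep DP from 0 to 10 in 0.5-unit steps
best_dp = max((i * 0.5 for i in range(21)), key=d_payoff)
print(best_dp)  # 4.0: decode with the minimum, transmit with the rest
```

Any DP below 4 gives zero payoff, and any DP above 4 wastes power that could have gone to TP, so the sweep lands exactly on the threshold.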

So my question is: how can I prove that this is the optimal solution? Can I claim this point as a Nash equilibrium (and prove it mathematically, if possible)? Beyond this point (i.e., increasing the value of DP) does not benefit either player (node R or node D).

Edit: If we try to assign payoffs, they will be as follows:

  1. If the signal goes through, the payoff is the data rate at the destination.
  2. The source node's transmit power (STP) is fixed and DP is fixed; if STP increases or DP decreases, in both cases the probability that the signal goes through increases.
  3. If the signal does not go through, the payoff is the loss of resources, i.e. power, frequency (radio resources), and time. I have also updated this information in the main question.

If not a Nash equilibrium, is there any other way to prove it?

Any kind of help will be very much appreciated.

Best Answer

Here is what I consider a counterexample to your claim. I know nothing about wireless relays.

Assume the payoff to D is $$f(TP)\,K - TP$$ where $K$ is the constant amount that D gets if the transfer goes through (0 if it doesn't arrive), and $f(TP)$ is the probability that the signal goes through, increasing in $TP$. Then D's problem is $$\max_{TP}\,[f(TP)K - TP].$$ Take the derivative to get the first-order condition $$K f'(TP) - 1 = 0$$ and conclude $$f'(TP) = 1/K.$$ Now specialize, and assume $f = \sqrt{TP/6}$. Then $f'(TP) = 1/(2\sqrt{6}\sqrt{TP})$, so the condition implies $K = 2\sqrt{6}\sqrt{TP}$, or $TP = K^2/24$. Set $K = 10$ on the grounds that that is the initial power; then $TP^* = 100/24 \approx 4.17 < 6$. The intuition is that it is too expensive for D to push the transmit power up to the level that delivers with probability one.
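A quick numerical check of this optimum (the square-root success model and $K = 10$ are the assumptions above; the brute-force grid is mine). With $f(TP) = \sqrt{TP/6}$, the first-order condition $f'(TP) = 1/K$ gives $TP^* = K^2/24 \approx 4.17$:

```python
import math

K = 10.0  # destination's value if the transfer goes through

def f(tp):
    """Assumed success probability, capped at 1 (it reaches 1 at tp = 6)."""
    return min(1.0, math.sqrt(tp / 6.0))

def d_payoff(tp):
    """D's payoff: expected value of delivery minus transmit power."""
    return f(tp) * K - tp

# brute-force search over TP in [0, 6] in 0.001-unit steps
grid = [i / 1000.0 for i in range(6001)]
tp_star = max(grid, key=d_payoff)

print(tp_star)                            # close to K**2 / 24 ≈ 4.167
print(d_payoff(tp_star) > d_payoff(6.0))  # full power (certain delivery) is worse
```

The search confirms the interior optimum: using all 6 units to guarantee delivery yields payoff $10 - 6 = 4$, slightly worse than the $\approx 4.17$ at $TP^* = K^2/24$.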

Now for some game theory: R should choose a sharing rule with D. Suppose R values the transmission at $h(TP)$ and gives $g(TP)$ to D. Then the problem is to choose the function $g(TP)$ to maximize R's payoff: $$\max_g\; h(TP) - g(TP)$$ $$\text{s.t. } g(TP) \leq h(TP)$$ $$\text{s.t. } TP \in \arg\max_{TP}\,[g(TP)f(TP) - TP].$$

But solving that is non-trivial!
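One way to get a feel for the difficulty is to restrict the problem and grid-search it numerically. Everything below is my own assumption, not part of the answer: $g$ is restricted to the linear family $g(TP) = a \cdot TP$, $h(TP) = K f(TP)$ with $K = 10$, and $f(TP) = \sqrt{TP/6}$ as above. For each $a$ we solve D's inner problem by brute force, then pick the $a$ that is best for R:

```python
import math

K = 10.0

def f(tp):
    """Assumed success probability (capped at 1)."""
    return min(1.0, math.sqrt(tp / 6.0))

def h(tp):
    """Assumed value of the transmission to R."""
    return K * f(tp)

tps = [i / 100.0 for i in range(601)]  # candidate TP values in [0, 6]

def best_response_tp(a):
    """D's inner problem: TP maximizing g(TP)*f(TP) - TP with g(TP) = a*TP."""
    return max(tps, key=lambda tp: a * tp * f(tp) - tp)

# outer problem: R picks the share parameter a in (0, 3]
best_a = max((j / 100.0 for j in range(1, 301)),
             key=lambda a: h(best_response_tp(a)) - a * best_response_tp(a))
tp = best_response_tp(best_a)
print(best_a, tp, h(tp) - best_a * tp)
```

Even in this toy family the structure is telling: D's inner objective is convex in $TP$, so D jumps between the corners $TP = 0$ and $TP = 6$, and R's best linear share sits just above $a = 1$. A crude restriction already requires a nested optimization, which is why the general functional problem is non-trivial.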