Proving lower bounds from an algorithmic game theory paper (specifically, that the price of anarchy is lower bounded by 3/2 for $m$ links)


This question is similar to Understanding proofs from paper on Game Theory (Price of Anarchy)

This question is about the same proof: the lower bound on the price of anarchy (sometimes called the coordination ratio). Theorem 1 from the above paper says:

"The price of anarchy for $m$ links is $\Omega(\log(n)/\log(\log(n)))$. For $m = 2$ links, it is at least $\frac{3}{2}$."

But the proof has me confused. Here is what I think they are trying to prove (the part that I'm interested in): the price of anarchy is at least $\frac{3}{2}$ when there are $m$ links (or, worded another way, $m$ machines to process $n$ jobs).

But here is what I think they are actually proving:

*Given* $m = n$ (the number of tasks to be completed, $n$, equals the number of machines, $m$, that do them; equivalently, we could say $n$ agents and $m$ edges/links) *and* that the amount of work each agent $i$ needs to send (or the time job $i$ takes) — call this $w_i$ — equals 1, the lower bound is $\frac{3}{2}$.

It seems that they have not proved that the price of anarchy is at least $\frac{3}{2}$ in general, but only that it is at least $\frac{3}{2}$ under these specific assumptions. Thus, the proof appears less general than the theorem statement claims.

FWIW, these assumptions seem to be used in several places in the proof. To prove the lower bound on the price of anarchy, you need to show that the expected maximum traffic over all links is at least $\frac{3}{2}$. They do this with a balls-and-bins argument: there are $m$ bins and $n$ balls, where $n = m$. So their proof requires $n = m$ (though I'm not sure whether it also requires $w_i = 1$ for all $i$). I think the claim that OPT has cost 1 also depends on $m = n$ and $w_i = 1$.
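To make the balls-and-bins calculation concrete, here is a small sketch (my own illustration, not code from the paper) for the $m = n = 2$, $w_i = 1$ case. In the fully mixed Nash equilibrium each job picks each link uniformly at random, so we can enumerate all equally likely pure assignments and average the maximum load:

```python
from itertools import product

# Assumed setup from the question: m = n = 2 identical links/jobs, w_i = 1.
m, n = 2, 2

# In the fully mixed equilibrium, each of the 2^n pure assignments
# (a tuple mapping job -> link) occurs with probability 1 / m^n.
expected_max_load = 0.0
for assignment in product(range(m), repeat=n):
    # With unit weights, a link's load is just how many jobs chose it.
    loads = [assignment.count(link) for link in range(m)]
    expected_max_load += max(loads) / (m ** n)

# OPT places one unit job on each link, so the optimal makespan is 1
# (this is exactly where m = n and w_i = 1 are used).
opt = 1.0

print(expected_max_load)        # expected maximum load: 1.5
print(expected_max_load / opt)  # ratio: 1.5, i.e. the 3/2 lower bound
```

The four assignments give maximum loads 2, 1, 1, 2, so the expectation is $6/4 = 3/2$, while OPT is 1 — which is how the $\frac{3}{2}$ ratio arises for this instance.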

Am I missing something?

Note: the price of anarchy is the ratio of the cost of the worst-case Nash equilibrium to the cost of the best-case (optimal) outcome.