Nash equilibrium: comparing different definitions


A Nash equilibrium seems to be defined in different ways in different books. Sometimes a Nash equilibrium refers to a single strategy (Definition $1$ below) and sometimes a Nash equilibrium is defined for a pair of strategies (Definition $2$ below).

Definition $1$: A strategy $\mathbf{p}$ is a Nash equilibrium if it is a best reply to itself (e.g., Hofbauer and Sigmund Evolutionary Games and Population Dynamics, p. 62.)

Definition $2$: The pair $(\mathbf{p}, \mathbf{q})$ is a Nash equilibrium if $\mathbf{p}$ is a best response to $\mathbf{q}$ and $\mathbf{q}$ is a best response to $\mathbf{p}$ (e.g., Broom and Rychtar Game theoretical models in biology).

My question is this: how are these definitions related, and which definition of Nash equilibrium is used by the Nash existence theorem?


Further comments:

  1. I initially thought that Definition 1 was a particular case of Definition 2 for symmetric games, but I don't think this is true. For instance, consider a symmetric $2\times 2$ game with payoff matrix:

$$\begin{array}{ccc} & A & B\\ A& a & b\\ B& c & d \end{array} $$

where $a<c$ and $b>d$. According to Definition 1, neither $A$ nor $B$ is a Nash equilibrium (because neither is a best reply to itself). But according to Definition 2, the pairs $(A,B)$ and $(B,A)$ are Nash equilibria.
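As a concrete check, here is a small Python sketch of this example. The specific payoff values ($a=0$, $b=1$, $c=1$, $d=0$, satisfying $a<c$ and $b>d$) are illustrative assumptions, not from the question:

```python
# Symmetric 2x2 game; payoffs are for the row player.
# Illustrative values satisfying a < c and b > d.
a, b, c, d = 0, 1, 1, 0
payoff = {("A", "A"): a, ("A", "B"): b,
          ("B", "A"): c, ("B", "B"): d}
strategies = ["A", "B"]

def best_replies(opponent):
    """Pure strategies maximizing the row player's payoff against `opponent`."""
    best = max(payoff[(s, opponent)] for s in strategies)
    return {s for s in strategies if payoff[(s, opponent)] == best}

# Definition 1: strategies that are a best reply to themselves.
def1_equilibria = {s for s in strategies if s in best_replies(s)}

# Definition 2: pairs (p, q) with p a best reply to q and q a best reply to p.
def2_equilibria = {(p, q) for p in strategies for q in strategies
                   if p in best_replies(q) and q in best_replies(p)}

print(def1_equilibria)  # set() -- no pure strategy is a best reply to itself
print(def2_equilibria)  # {('A', 'B'), ('B', 'A')}
```

This reproduces the claim above: Definition 1 yields no (pure) equilibrium, while Definition 2 yields the two asymmetric pairs.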

  2. Sometimes Nash equilibrium is defined in yet another way (as A. Sh. pointed out, this is a generalization of Definition 2 to $n$ players):

Definition $3$: A Nash equilibrium is a strategy profile in which every player is playing a best response to the current strategy profile.

The above definition appears in Gintis, Game Theory Evolving, p. 43. It is also the definition given in the Wikipedia article on Nash equilibrium: https://en.wikipedia.org/wiki/Nash_equilibrium#Formal_definition
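Definition 3 is straightforward to check mechanically: a profile is a Nash equilibrium if no single player can gain by deviating unilaterally. A minimal sketch, using a hypothetical 3-player coordination game (each player earns 1 if all three choices match, 0 otherwise; this game is my own illustration, not from the sources above):

```python
from itertools import product

# Hypothetical 3-player coordination game: payoff 1 if all choices match.
def payoff(i, profile):
    return 1 if len(set(profile)) == 1 else 0

strategies = [0, 1]
n = 3

def is_nash(profile):
    """Definition 3: every player plays a best response to the profile."""
    for i in range(n):
        current = payoff(i, profile)
        for s in strategies:
            deviation = profile[:i] + (s,) + profile[i + 1:]
            if payoff(i, deviation) > current:
                return False  # player i has a profitable unilateral deviation
    return True

equilibria = [p for p in product(strategies, repeat=n) if is_nash(p)]
print(equilibria)  # [(0, 0, 0), (1, 1, 1)]
```

For $n=2$ this check reduces exactly to Definition 2: each of the two players must be best-responding to the other.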


Accepted answer:

Definition 3 is the correct definition of Nash equilibrium of a general $n$-player game.

Definition 2 is the special case of Definition 3 for a general $2$-player game.

Definition 1 is the definition of a symmetric Nash equilibrium of a $2$-player game. Note that the symmetric Nash equilibria form a proper subset of the Nash equilibria, so strictly speaking, Definition 1 should not be taken as a definition of Nash equilibrium.
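In mixed strategies the two notions can be reconciled for the example above: although no pure strategy is a best reply to itself, the mixed strategy $p^* = \frac{b-d}{(b-d)+(c-a)}$ (probability of playing $A$) is. A sketch verifying this numerically, again with the assumed illustrative payoffs $a=0$, $b=1$, $c=1$, $d=0$:

```python
# Illustrative payoffs with a < c and b > d, as in the question.
a, b, c, d = 0, 1, 1, 0

# Candidate symmetric equilibrium: probability of playing A.
p_star = (b - d) / ((b - d) + (c - a))

def expected_payoff(p, q):
    """Row player's expected payoff playing A w.p. p against A w.p. q."""
    return p * q * a + p * (1 - q) * b + (1 - p) * q * c + (1 - p) * (1 - q) * d

# Definition 1 in mixed strategies: p_star is a best reply to itself,
# i.e. no mixed strategy does strictly better against p_star.
best = max(expected_payoff(p / 100, p_star) for p in range(101))
print(abs(best - expected_payoff(p_star, p_star)) < 1e-12)  # True
```

So the game does have a symmetric Nash equilibrium, just not in pure strategies, consistent with Nash's theorem for symmetric games.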

Second answer:

The first definition refers to the special case of a symmetric equilibrium in a symmetric game. The second is more general and, as far as I know, is the standard one.