Why do we need the notion of a compact set to define a game?


$\textit{Definition:}$ $I$ is a finite set of players, and $G=((S^i)_{i\in I},g)$ is a compact game: each player $i$ has a compact strategy set $S^i$, and the payoff function $g:S=\times_i S^i \to \mathbb{R}^{I}$ is continuous. As is standard in game theory, the set of mixed strategies is defined as $\Sigma^i=\Delta(S^i)$, and $g$ is extended to $\Sigma=\times_i\Sigma^i$ by $g(\sigma)=\mathbb{E}_{\sigma}\,g(s)$.
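For a finite game the extension $g(\sigma)=\mathbb{E}_{\sigma}\,g(s)$ is just an average of pure payoffs weighted by the product of the players' mixing probabilities. A minimal sketch (the payoff matrix and function names here are my own illustration, not part of the definition), using Matching Pennies payoffs for player 1:

```python
import numpy as np

# Hypothetical 2-player, 2-strategy example: Matching Pennies payoffs
# for player 1. Rows = player 1's pure strategies, columns = player 2's.
payoff_1 = np.array([[1.0, -1.0],
                     [-1.0, 1.0]])

def expected_payoff(payoffs, sigma_1, sigma_2):
    """g(sigma) = E_sigma g(s): average the pure payoffs under the
    product distribution sigma_1 x sigma_2 over pure-strategy profiles."""
    return float(sigma_1 @ payoffs @ sigma_2)

# Uniform mixing by both players gives expected payoff 0 for player 1.
sigma = np.array([0.5, 0.5])
print(expected_payoff(payoff_1, sigma, sigma))  # 0.0
```

Note that each pure strategy $s^i$ reappears inside $\Sigma^i$ as the degenerate distribution putting probability 1 on $s^i$, which is why $g$ on $\Sigma$ is genuinely an extension of $g$ on $S$.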

$\textit{Question 1:}$ Why do we need the notion of a compact game (set) from topology? Can anyone give some intuition and/or an example?

$\textit{Question 2:}$ To the best of my knowledge, the subscript $\sigma$ on the expectation operator, i.e. $\mathbb{E}_{\sigma}$, denotes the probability measure with respect to which the expectation is taken. In this case $\sigma$ stands for the mixed strategy, which is a probability distribution over the set of pure strategies (if I am not mistaken). Does this mean that $\sigma$ coincides with the probability measure?

I have updated my question. Thank you in advance!

Best Answer

While it's hard to be certain about this assumption without seeing the full context, compactness is typically assumed in order to guarantee that a solution to a problem exists. I'm assuming that optimizing $g$ over $(S^i)_{i\in I}$ describes a solution of your game? If so, then the game is guaranteed to possess a solution, since a continuous function always attains its extrema on a compact set (the extreme value theorem).

If the assumption of compactness is dropped, a maximizer may simply fail to exist, and much more delicate arguments are needed to verify that a solution exists.
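The failure mode is easy to see already with one player. As a minimal sketch (the set, payoff, and function name are my own illustration): take the strategy set $S=(0,1)$, which is bounded but not closed, hence not compact, and the continuous payoff $g(s)=s$. Then $\sup_{s\in S} g(s)=1$, but no strategy in $S$ attains it, and numerical maximizers only creep toward the missing boundary point:

```python
# Sketch: on the non-compact set S = (0, 1), the continuous payoff
# g(s) = s has supremum 1 but no maximizer. Grid search illustrates
# this: the argmax approaches 1 as the grid is refined, yet the
# limit point 1 lies outside S.
def best_on_grid(n):
    grid = [k / n for k in range(1, n)]   # points of (0,1), endpoints excluded
    return max(grid, key=lambda s: s)     # argmax of g(s) = s on the grid

print(best_on_grid(10))    # 0.9
print(best_on_grid(1000))  # 0.999 -- tends to 1, which is not in S
```

Closing the interval to $S=[0,1]$ restores compactness and the maximizer $s^*=1$ exists, which is exactly what the extreme value theorem promises.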