I am working on a game theory assignment where a game is played on a graph $G = (V,E)$, where $V$ is the set of vertices and $E$ the set of edges connecting them. Two players, Adam and Eve, take alternating turns moving from vertex to vertex along the existing edges. There is more to the game, but it is not relevant to my question.
An infinite play $P$ is won by Eve if some position (a vertex visited during the play) repeats infinitely often; otherwise Adam wins. I want to write this condition down as formally as I can, and I came up with the following:
$$\forall p\in V, \sum_{i \in P:i=p}^\infty 1 = \infty$$
Is this valid notation? Is there a better, more efficient, or more easily understandable way of writing the winning condition?
Forgive me if I got the formatting wrong; this is my first time posting anything math-related on a Stack Exchange site.
The first thing you should do is give a formal definition of a play. It could be defined as a sequence $(v_n)_{n \geqslant 0}$ of vertices of your graph such that, for each $n$, $(v_n, v_{n+1})$ is an edge of the graph. Thus a play is just an infinite path $p = v_0 \rightarrow v_1 \rightarrow v_2 \rightarrow \dotsm\ $ in the graph. You can now define $$ \text{Inf}(p) = \{v \in V \mid v \text{ occurs infinitely often in the sequence $(v_n)_{n \geqslant 0}$} \} $$ Now your winning condition is usually expressed as a condition on $\text{Inf}(p)$. In your case, Eve wins the play $p$ if and only if $\text{Inf}(p) \neq \emptyset$.
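To make this concrete, here is a small sketch (my own illustration, not part of the answer above) of how $\text{Inf}(p)$ can be computed for plays on a finite graph. Such plays are commonly represented as a "lasso": a finite prefix followed by a cycle repeated forever; $\text{Inf}(p)$ is then exactly the set of vertices on the cycle. The function names `inf` and `eve_wins` are my own choices.

```python
def inf(prefix, cycle):
    """Inf(p) for the play p = prefix . cycle^omega:
    the vertices occurring infinitely often are exactly those on the cycle."""
    return set(cycle)

def eve_wins(prefix, cycle):
    """Eve wins iff some vertex repeats infinitely often, i.e. Inf(p) is nonempty."""
    return len(inf(prefix, cycle)) > 0

# Example play: 0 -> 1 -> 2 -> 3 -> 2 -> 3 -> ...
print(inf([0, 1], [2, 3]))       # {2, 3}
print(eve_wins([0, 1], [2, 3]))  # True
```

Note that on a *finite* graph the pigeonhole principle forces some vertex to repeat infinitely often in any infinite play, so $\text{Inf}(p)$ is always nonempty and Eve wins every infinite play; the condition only discriminates between the players when the graph is infinite.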