Let $(N,S,u)$ be a strategic form game, where
i) $N = \{1, 2, \dots, n\}$ is the set of players,
ii) $S_i$ is the set of strategies for the $i^{th}$ player, and $S = S_1 \times S_2 \times \dots \times S_n$ is the set of strategy profiles, and
iii) $u_i : S \rightarrow \mathbb{R}$ is the utility function for the $i^{th}$ player, and $u = (u_1,u_2, ...,u_n)$ is a vector of the $n$ utility functions.
I wish to formally define the concept of a strong Nash equilibrium (SNE) for such a game. (A non-mathematical definition is given on Wikipedia. Link: https://en.wikipedia.org/wiki/Strong_Nash_equilibrium )
My attempted definition:
Let $U \subseteq N$ be a non-empty subset of the $n$ players. If $s = (s_1, s_2, \dots, s_n) \in S$ is a strategy profile, then let $s_U$ denote the vector of strategies of the players in $U$ and let $s_{-U}$ denote the vector of strategies of the players not in $U$. So $s = (s_U, s_{-U})$.
A strategy profile $s \in S$ is a strong Nash equilibrium if $\neg \bigl( \exists\, T \subseteq N,\ T \neq \emptyset,\ \exists\, s' = (s'_T, s_{-T}) \in S,\ s'_T \neq s_T, \text{ such that } u_i(s') > u_i(s) \ \forall\, i \in T \bigr)$. Equivalently: for every non-empty coalition $T \subseteq N$ and every $s' = (s'_T, s_{-T}) \in S$ with $s'_T \neq s_T$, there is some $i \in T$ with $u_i(s') \leq u_i(s)$.
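For a finite game, this definition can be checked directly by enumerating every non-empty coalition and every joint deviation. The sketch below is my own illustration, not part of the question: `strategies[i]` plays the role of $S_i$, profiles are dicts mapping players to strategies, and `u[i]` is the utility function $u_i$. The coordination-game payoffs at the end are an invented example.

```python
from itertools import product, chain, combinations

def is_strong_nash(s, strategies, u):
    """Brute-force test of the SNE definition for a finite game.

    s          : profile, dict player -> strategy
    strategies : dict player -> list of strategies (S_i)
    u          : dict player -> utility function on full profiles (u_i)
    """
    players = list(strategies)
    # Every non-empty coalition T subset of N
    coalitions = chain.from_iterable(
        combinations(players, r) for r in range(1, len(players) + 1))
    for T in coalitions:
        # Every joint deviation s'_T of the coalition
        for dev in product(*(strategies[i] for i in T)):
            s_prime = dict(s)
            s_prime.update(zip(T, dev))
            if s_prime == dict(s):
                continue  # the definition requires s'_T != s_T
            # The deviation refutes SNE only if EVERY member of T strictly gains
            if all(u[i](s_prime) > u[i](s) for i in T):
                return False
    return True

# Hypothetical pure coordination game: (A, A) Pareto-dominates (B, B).
strategies = {1: ["A", "B"], 2: ["A", "B"]}
payoff = {("A", "A"): 2, ("B", "B"): 1, ("A", "B"): 0, ("B", "A"): 0}
u = {i: (lambda p: payoff[(p[1], p[2])]) for i in (1, 2)}

print(is_strong_nash({1: "A", 2: "A"}, strategies, u))  # True
print(is_strong_nash({1: "B", 2: "B"}, strategies, u))  # False: {1, 2} jointly deviate to (A, A)
```

Note that $(B, B)$ here is an ordinary Nash equilibrium (no player gains by deviating alone), but it fails the strong condition because the grand coalition profits by moving to $(A, A)$.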
Is this correct? If not, could someone please give a mathematical definition of SNE without using non-vector matrices and simplices?