Let $f\colon A \times B \rightarrow \mathbb{R}$. I am reading a paper in which some quantity of the form $$ \inf_{a\in A}\sup_{b\in B}f(a,b) $$ is equal to $$ \inf_{a\in A^{\prime}}\sup_{b\in B^{\prime}}f(a,b) $$ where $A^{\prime}\subset A$ and $B^{\prime}\subset B$.
To establish equality, the author claims it is sufficient to show that for all $\epsilon>0$, $a\in A\setminus A^{\prime}$, and $b\in B\setminus B^{\prime}$ there exist $a^\prime\in A^{\prime}$ and $b^\prime\in B^{\prime}$ such that $$ \left|f(a,b)-f(a^{\prime},b^{\prime})\right|\leq\epsilon. $$ However, I do not follow this argument. Is this claim true, and if so, why?
The claim is false. Let $A' = A$ be any nonempty subset of $\mathbb R$, let $B' = [0,1]$ and $B = [0,2]$, and let $f(a,b) = b$.
Then $$\inf_{a\in A} \sup_{b\in B} f(a,b) = \inf_{a\in A} \sup_{b\in [0,2]} b = 2$$ and $$\inf_{a\in A'} \sup_{b\in B'} f(a,b) = \inf_{a\in A'} \sup_{b\in [0,1]} b = 1.$$
These quantities are not equal. Yet the stated condition holds vacuously: it quantifies over all $a\in A\setminus A'$, and here $A\setminus A' = \emptyset$, so there is no $a$ for which anything needs to be checked. Hence the condition is true while the two inf-sup values differ.
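For concreteness, here is a small numerical sketch of the counterexample on finite grids (the grids `A_pts`, `B_pts`, `Bp_pts` and the helper `inf_sup` are my own illustrative choices, not part of the question):

```python
import numpy as np

def inf_sup(A_pts, B_pts, f):
    """inf over a in A_pts of sup over b in B_pts of f(a, b), on finite grids."""
    return min(max(f(a, b) for b in B_pts) for a in A_pts)

f = lambda a, b: b                     # the counterexample's f(a, b) = b

A_pts  = np.linspace(0.0, 1.0, 11)     # a grid for A = A' (any nonempty set works)
B_pts  = np.linspace(0.0, 2.0, 21)     # a grid for B  = [0, 2]
Bp_pts = np.linspace(0.0, 1.0, 11)     # a grid for B' = [0, 1]

print(inf_sup(A_pts, B_pts, f))   # 2.0
print(inf_sup(A_pts, Bp_pts, f))  # 1.0
```

Since $f$ is independent of $a$, the outer infimum does nothing, and the two values are just $\sup B = 2$ and $\sup B' = 1$, matching the computation above.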