I have a simple two-player game:
\begin{array}{|c|c|c|} \hline & L & R \\ \hline T & 0,1 & 0,0 \\ \hline B & 1,0 & 0,1 \\ \hline \end{array}
How can I show that this game has infinitely many mixed Nash equilibria? I know how to show it graphically, but is there a way to show it purely algebraically?
I calculated that:
$EU(T)>EU(B)$ when $\pi_2<0$, where $\pi_2$ is the probability that player 2 chooses $L$ (so this never holds).
$EU(L)>EU(R)$ when $\pi_1>\frac{1}{2}$, where $\pi_1$ is the probability that player 1 chooses $T$.
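For reference, these comparisons follow directly from the payoff matrix (writing $EU_i$ for player $i$'s expected utility):

$$EU_1(T)=\pi_2\cdot 0+(1-\pi_2)\cdot 0=0,\qquad EU_1(B)=\pi_2\cdot 1+(1-\pi_2)\cdot 0=\pi_2,$$
$$EU_2(L)=\pi_1\cdot 1+(1-\pi_1)\cdot 0=\pi_1,\qquad EU_2(R)=\pi_1\cdot 0+(1-\pi_1)\cdot 1=1-\pi_1.$$

So $EU_1(T)>EU_1(B)$ iff $0>\pi_2$, and $EU_2(L)>EU_2(R)$ iff $\pi_1>1-\pi_1$, i.e. $\pi_1>\frac{1}{2}$.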
How can I deduce from this that infinitely many mixed equilibria exist? All I can find is the pure NE $(B,R)$.
If player 1 chooses $B$, then player 2's unique best response is $R$. And against $R$, player 1 gets a payoff of zero no matter what she plays.
Suppose that player 1 chooses $T$ with probability $p$ and $B$ with probability $1-p$. Against $R$, every $p$ gives player 1 an expected payoff of zero, so every $p$ is a best response to $R$. For $R$ to remain a best response for player 2, we need $EU(R)=1-p\geq EU(L)=p$, i.e. $p\leq\frac{1}{2}$. Hence every profile $\bigl((p,1-p),R\bigr)$ with $p\in[0,\frac{1}{2}]$ is a Nash equilibrium, which gives infinitely many of them.
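To double-check this equilibrium set numerically, here is a small sketch using `numpy` (the matrices `U1`, `U2` and the helper `is_nash` are my own names, not from the question). It verifies that $\bigl((p,1-p),R\bigr)$ is a Nash equilibrium exactly when $p\leq\frac{1}{2}$:

```python
import numpy as np

# Payoff matrices: rows = (T, B), columns = (L, R)
U1 = np.array([[0, 0], [1, 0]])  # player 1's payoffs
U2 = np.array([[1, 0], [0, 1]])  # player 2's payoffs

def is_nash(p, q, tol=1e-9):
    """Check whether (p = P[T], q = P[L]) is a mixed Nash equilibrium."""
    s1 = np.array([p, 1 - p])
    s2 = np.array([q, 1 - q])
    v1 = U1 @ s2   # player 1's payoff from each pure strategy against s2
    v2 = s1 @ U2   # player 2's payoff from each pure strategy against s1
    eu1 = s1 @ v1  # player 1's expected payoff under (s1, s2)
    eu2 = v2 @ s2  # player 2's expected payoff under (s1, s2)
    # Equilibrium: neither player can gain by deviating to a pure strategy
    return eu1 >= v1.max() - tol and eu2 >= v2.max() - tol

# (p, R) is an equilibrium for p <= 1/2 but not for p > 1/2
print([is_nash(p, 0.0) for p in (0.0, 0.25, 0.5, 0.75, 1.0)])
```

Running this confirms the cutoff at $p=\frac{1}{2}$: the first three profiles are equilibria and the last two are not.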