Given two disjoint, nonempty, closed, convex sets $A,B\subseteq\mathbb R^n$, can we always find a vector $v\in\mathbb R^n\setminus\{0\}$ such that $\langle v,a\rangle<\langle v,b\rangle$ for all $a\in A,b\in B$?
Note that this requirement is weaker than other notions of strict separation (such as the existence of $\lambda\in\mathbb R$ such that $\langle v,a\rangle<\lambda<\langle v,b\rangle$ for all $a\in A,b\in B$).
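To make the distinction concrete, here is a standard zero-distance example (added by me for illustration, not part of the original question): take
$$A=\{(x,y)\in\mathbb R^2 : y\le 0\},\qquad B=\{(x,y)\in\mathbb R^2 : x>0,\ xy\ge 1\}.$$
Both sets are closed, convex, and disjoint, with $\inf_{a\in A,\,b\in B}\|a-b\|=0$. Taking $v=(0,1)$ gives $\langle v,a\rangle=a_2\le 0<b_2=\langle v,b\rangle$ for all $a\in A$, $b\in B$, so the weaker separation above holds; but no $\lambda$ satisfies $\langle v,a\rangle<\lambda<\langle v,b\rangle$ for all $a,b$, since $\sup_{a\in A}a_2=0=\inf_{b\in B}b_2$.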
Remarks:
- if a counterexample exists, $A$ and $B$ must be unbounded, with zero distance (i.e., $\inf_{a\in A,\,b\in B}\|a-b\|=0$);
- I'm fairly confident this holds when $n=1$ and $n=2$ (for $n=2$, I have an argument using the recession cones of $A$ and $B$), and I suspect it holds in general.
Any counterexamples or references?