In a real vector space, let $S$ and $T$ be disjoint, nonempty, convex subsets, and let $x$ be a point lying outside both sets. How would I prove the following: either $\operatorname{co}(\{x\}\cup S)$ is disjoint from $T$, or $\operatorname{co}(\{x\}\cup T)$ is disjoint from $S$?
Remember, no topological assumptions have been made, so I don't think I can use the separating hyperplane theorem.
If neither of the alternatives in the desired conclusion held, you'd have some $t\in T$ in the convex hull of $S\cup\{x\}$ and some $s\in S$ in the convex hull of $T\cup\{x\}$. Since $S$ and $T$ are themselves convex, every point of $\operatorname{co}(S\cup\{x\})$ has the form $ax+(1-a)s'$ with $s'\in S$, and similarly for $\operatorname{co}(T\cup\{x\})$. So, for suitable $s'\in S$, $t'\in T$, and $a,b\in(0,1)$, you'd have $$ t=ax+(1-a)s'\qquad\text{and}\qquad s=bx+(1-b)t'. $$ (The reason $a$ and $b$ are in $(0,1)$ rather than merely in $[0,1]$ is that $S$, $T$, and $\{x\}$ were assumed to be pairwise disjoint: $a=1$ would give $t=x$, and $a=0$ would give $t=s'\in S$.)

Solve those two equations for $x$ and set the two resulting expressions (one using $t$ and $s'$, the other using $s$ and $t'$) equal to each other, so you get a linear equation in $s,s',t,t'$. Multiply both sides by $\frac{ab}{a+b-ab}$, noting that $a+b-ab$ is positive because it equals $1-(1-a)(1-b)$. Transposing some terms, you get $$ t\cdot\frac b{a+b-ab}+t'\cdot\frac{a(1-b)}{a+b-ab}=s\cdot\frac a{a+b-ab}+ s'\cdot\frac{b(1-a)}{a+b-ab}. $$

Notice that, on each side of this equation, the two fractions are positive and add up to $1$. So the left side is a convex combination of $t$ and $t'$ and therefore belongs to $T$. Similarly, the right side is in $S$. But this contradicts the assumption that $S$ and $T$ are disjoint.
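As a sanity check on the algebra (not part of the proof), here is a short Python sketch that verifies the displayed identity with exact rational arithmetic for points in $\mathbb{Q}^2$: it picks random $x$, $s'$, $t'$ and coefficients $a,b\in(0,1)$, forms $t=ax+(1-a)s'$ and $s=bx+(1-b)t'$, and confirms the two convex combinations agree. All names here are my own for illustration.

```python
from fractions import Fraction as F
import random

def comb(c1, v1, c2, v2):
    """Return the linear combination c1*v1 + c2*v2 of 2D points (tuples)."""
    return tuple(c1 * p + c2 * q for p, q in zip(v1, v2))

random.seed(0)
for _ in range(100):
    # Random rational points in Q^2 and coefficients a, b strictly in (0,1).
    rnd = lambda: F(random.randint(-9, 9), random.randint(1, 9))
    x, s1, t1 = (rnd(), rnd()), (rnd(), rnd()), (rnd(), rnd())
    a = F(random.randint(1, 9), 10)
    b = F(random.randint(1, 9), 10)

    t = comb(a, x, 1 - a, s1)   # t = a x + (1-a) s'
    s = comb(b, x, 1 - b, t1)   # s = b x + (1-b) t'
    d = a + b - a * b           # = 1 - (1-a)(1-b) > 0

    # Both sides of the identity from the answer; coefficients on each
    # side are positive and sum to 1, so each side is a convex combination.
    lhs = comb(b / d, t, a * (1 - b) / d, t1)
    rhs = comb(a / d, s, b * (1 - a) / d, s1)
    assert lhs == rhs

print("identity verified on 100 random instances")
```

Exact `Fraction` arithmetic avoids any floating-point tolerance question, so the assertion tests the identity literally.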