Let $G=(V,E)$ be a graph with $n$ vertices and minimum degree $\delta>10$. Prove that there is a partition of $V$ into two disjoint subsets $A$ and $B$ such that $|A|\leq O(\frac{n\ln\delta}{\delta})$, and each vertex of $B$ has at least one neighbor in $A$ and at least one neighbor in $B$.
[Source: The probabilistic method, Alon and Spencer]
I would like to set up a probabilistic argument here, but I'm not sure how to start.
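One standard way to begin is the alteration method (as used for dominating sets in Alon and Spencer); the specific choice of $p$ below is my assumption, sketched only to handle the "neighbor in $A$" condition, not the full problem:

```latex
% Include each vertex in A independently with probability p.
% By linearity of expectation,
\mathbb{E}\bigl[|A|\bigr] = np.
% For a fixed vertex v with at least delta neighbors, the probability
% that no neighbor of v lands in A is
\Pr\bigl[N(v)\cap A=\emptyset\bigr] \le (1-p)^{\delta} \le e^{-p\delta}.
% Choosing p = (ln delta)/delta makes this at most 1/delta, so the
% expected number of vertices with no neighbor in A is at most n/delta.
% Moving all such vertices into A (the "alteration" step) gives
\mathbb{E}\bigl[|A|\bigr] \le \frac{n\ln\delta}{\delta} + \frac{n}{\delta}
  = O\!\left(\frac{n\ln\delta}{\delta}\right).
```

By linearity this bound holds in expectation, so some outcome achieves it; the remaining work is to also guarantee each vertex of $B$ keeps a neighbor in $B$, which this sketch does not yet address.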
I propose this algorithm for the partition:
This may not be a probabilistic argument, but at least it's an argument!
Edit: Oops, I completely missed the requirement that $|A| \leq O(\frac{n \ln \delta}{\delta})$. Perhaps this would work:
This will put at least $\delta$ times as many vertices into $B$ as into $A$ at first, but that ratio will taper off. I'm not sure how to guarantee that $|A| \leq O(\frac{n \ln \delta}{\delta})$, but it's a possible starting point.