I am struggling to prove a statement which I think should be fairly easy. I have two sequences of compact convex subsets $A_j, B_j \subset \mathbb{R}^n$ with $A_j \to A$ and $B_j \to B$. Here $\to$ is convergence with respect to the Hausdorff metric (induced by the Euclidean metric on $\mathbb R^n$). I am trying to prove that $A_j \subset B_j$ for all $j$ implies that $A \subset B$.
Here are the definitions I am using: $$d(a,B) = \inf_{b\in B}{d(a,b)}$$ $$d(A,B) = \sup_{a\in A}{d(a,B)}$$ $$h(A,B) = \max\{d(A,B), d(B,A)\}$$
We then say $A_j \to A$ if $h(A_j, A) \to 0$ as $j \to \infty$.
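As a sanity check on these definitions, here is a minimal numeric sketch for *finite* point sets in $\mathbb{R}^n$ (the helper names `hausdorff`, `d_set_set`, etc. are my own, not from any library). It also shows why $d(A,B)$ alone is the right quantity for containment: $A\subset B$ forces $d(A,B)=0$ even when $h(A,B)>0$.

```python
import math

def d_point_set(a, B):
    # d(a, B) = inf_{b in B} d(a, b); min suffices for finite B
    return min(math.dist(a, b) for b in B)

def d_set_set(A, B):
    # d(A, B) = sup_{a in A} d(a, B); note this is NOT symmetric
    return max(d_point_set(a, B) for a in A)

def hausdorff(A, B):
    # h(A, B) = max{ d(A, B), d(B, A) }
    return max(d_set_set(A, B), d_set_set(B, A))

# A is a subset of B, so d(A, B) = 0, yet h(A, B) > 0:
A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(d_set_set(A, B))  # 0.0  (every point of A lies in B)
print(hausdorff(A, B))  # 1.0  (the point (2,0) is at distance 1 from A)
```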
Progress: I first proved that $A\subset B$ if and only if $d(A,B) = 0$ and tried to show the latter. I noted that, because $d$ is continuous and $A,B$ are compact, there exist $a^\prime \in A$ and $b^\prime \in B$ such that $$d(A,B) = \sup_{a\in A}{d(a,B)} = d(a^\prime, B) = \inf_{b\in B}{d(a^\prime,b)} = d(a^\prime, b^\prime)\text{.}$$ I then used the triangle inequality to show $$d(A,B) \leq d(a^\prime, a_j) + d(a_j, b^\prime)$$ for any $a_j \in A_j$. However, I then incorrectly claimed that this implies $$d(A,B) \leq d(a^\prime, A_j) + d(b^\prime, A_j)\text{,}$$ where both terms on the RHS go to zero as $j \to \infty$. This would work if the previous step were true! I have tried a few other similar things but can't seem to get it to work - any help would be appreciated.
Suppose there exists a point $p\in A\setminus B$. Let $r = d(p,B)$, which is positive since $B$ is compact (hence closed) and $p\notin B$. Pick $j$ such that $h(A_j,A)<r/3$ and $h(B_j,B)<r/3$.
Since $d(A,A_j)\le h(A_j,A)<r/3$, the $r/3$-neighborhood of $p$ contains a point $q\in A_j$. Since $A_j\subset B_j$, we also have $q\in B_j$, and since $d(B_j,B)\le h(B_j,B)<r/3$, the $r/3$-neighborhood of $q$ contains a point $b\in B$. By the triangle inequality, $d(p,B)\le d(p,q)+d(q,b) < 2r/3 < r$, a contradiction.
The convexity of sets is not needed.
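A quick self-contained numerical illustration of the statement, using finite sets on the real line (the helper `h` is ad hoc): $A_j=\{1/j\}\subset B_j=\{1/j,\,2\}$, with $A_j\to\{0\}$ and $B_j\to\{0,2\}$, and containment survives in the limit.

```python
def h(A, B):
    # Hausdorff distance for finite subsets of R
    d = lambda X, Y: max(min(abs(x - y) for y in Y) for x in X)
    return max(d(A, B), d(B, A))

A_lim, B_lim = {0.0}, {0.0, 2.0}          # candidate limits
for j in range(1, 6):
    Aj = {1.0 / j}                        # A_j -> {0}
    Bj = {1.0 / j, 2.0}                   # B_j -> {0, 2}
    assert Aj <= Bj                       # A_j is a subset of B_j for every j
    print(j, h(Aj, A_lim), h(Bj, B_lim))  # both distances shrink like 1/j

# In the limit, A = {0} is indeed contained in B = {0, 2}:
assert max(min(abs(a - b) for b in B_lim) for a in A_lim) == 0.0
```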