Let $X \subset \mathbb{R}^m$ be a finite set of points, classified by a function $f:X \to \{-1, 1\}$. We say that a hyperplane $H = \{x \in \mathbb{R}^m : \langle x, v \rangle + c = 0\}$ is a separating hyperplane for $X$ when $\forall\ x \in X,\ \operatorname{sign}(\langle x, v \rangle + c) = f(x)$.
Let $S, T \subset \mathbb{R}^m$ be separating hyperplanes, determined by the equations $\langle x, w \rangle + b = 0$ and $\langle x, w^* \rangle + b^* = 0$, respectively.
Do there necessarily exist continuous functions $\phi:[0, 1] \to \mathbb{R}^m$ and $\psi:[0, 1] \to \mathbb{R}$, such that $$\begin{align}\phi(0) &= w\\\phi(1) &= w^*\\\psi(0) &= b\\\psi(1) &= b^*\end{align}$$ and such that $\forall\ k \in [0, 1], H_k = \{x \in \mathbb{R}^m : \langle x, \phi(k) \rangle + \psi(k) = 0\}$ is a separating hyperplane?
In other words, given two separating hyperplanes, can one go from one to the other continuously (à la homotopy), passing only through other separating hyperplanes?
The question arises from wondering whether one could use a single-layer Perceptron (trained by the Perceptron learning algorithm) to approximate the SVM solution for a given training set $X$. The latter is more expensive to compute than the former, so it would be interesting if there were a way to move from one to the other.
Unless I am mistaken in the calculations, the set of separating hyperplanes (parametrized by the pairs $(v, c)$) is even convex, not just path-connected.
Let $H_0$ and $H_1$ be two separating hyperplanes generated by the linear functions $f_0(x) = \langle x, v_0 \rangle + w_0$ and $f_1(x) = \langle x, v_1 \rangle + w_1$: $H_0 = \lbrace x: f_0(x) = 0 \rbrace$, $H_1 = \lbrace x: f_1(x) = 0 \rbrace$. For $\alpha \in [0, 1]$ consider the linear function $f_\alpha = (1 - \alpha) f_0 + \alpha f_1 = \langle x, (1- \alpha) v_0 + \alpha v_1 \rangle + (1-\alpha)w_0 + \alpha w_1$. Since ${\rm sgn}\, f_0(x) = {\rm sgn}\, f_1(x) = f(x)$ for every $x \in X$, and a convex combination of two nonzero reals of the same sign again has that sign, $f_\alpha(x)$ has the same sign as $f(x)$. (In particular, if both classes occur in $X$ then $f_\alpha$ takes both signs, so its normal vector $(1-\alpha)v_0 + \alpha v_1$ is nonzero and $H_\alpha$ really is a hyperplane.) So $H_\alpha = \lbrace x : f_\alpha(x) = 0 \rbrace$ is a separating hyperplane.
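As a quick numerical sanity check of this construction (a sketch only: the toy data set, labels, and the two hyperplanes below are made up for illustration):

```python
import numpy as np

# Toy 2-D data set X with labels f(x) in {-1, +1} (made up for illustration)
X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-1.0, -3.0]])
f = np.array([1, 1, -1, -1])

# Two separating hyperplanes <x, v> + c = 0, given as (v, c) pairs
v0, c0 = np.array([1.0, 1.0]), 0.0
v1, c1 = np.array([2.0, 0.5]), -0.5

def separates(v, c):
    """True iff sign(<x, v> + c) = f(x) for every x in X."""
    return bool(np.all(np.sign(X @ v + c) == f))

assert separates(v0, c0) and separates(v1, c1)

# Every convex combination f_alpha = (1 - a) f_0 + a f_1 also separates X
for a in np.linspace(0.0, 1.0, 101):
    assert separates((1 - a) * v0 + a * v1, (1 - a) * c0 + a * c1)
```

Of course this only spot-checks one example; the argument above is what guarantees it for every $\alpha \in [0, 1]$.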