SVM after the training data has been linearly transformed


Let us assume that we have a solution $w^* \in \mathbb{R}^d$ for the following hard-margin SVM problem:

$$ \begin{array}{ll} \min_{w} & \left\|w\right\|^2\\ \text{s.t.} & y_i w^\top x_i \geq 1, \quad i = 1, \dots, n. \end{array} $$

Here, $y_i \in \{-1, 1\}$ and $x_i \in \mathbb{R}^d$, for $i \leq n$. Note that I am considering only separating hyperplanes through the origin, i.e. there is no bias term.

Let now $M$ be an invertible matrix. My question is, does $(M^{-1})^\top w^*$ solve the following hard-margin SVM problem?

$$ \begin{array}{ll} \min_{w} & \left\|w\right\|^2\\ \text{s.t.} & y_i w^\top M x_i \geq 1, \quad i = 1, \dots, n. \end{array} $$

Observe that the problem above is the original SVM problem, but the training data is now $\{(y_i, Mx_i) \mid i \leq n\}$ instead of $\{(y_i, x_i) \mid i \leq n\}$.
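One thing worth checking numerically is that the map $w \mapsto (M^{-1})^\top w$ preserves all the margins, so $(M^{-1})^\top w^*$ is at least *feasible* for the transformed problem. Here is a quick sketch with numpy; the data, labels, and matrix $M$ are arbitrary illustrative choices, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: n points in R^d with labels +/-1.
d, n = 3, 5
X = rng.standard_normal((n, d))          # rows are x_i
y = rng.choice([-1.0, 1.0], size=n)

# An arbitrary invertible matrix M (diagonally dominated, hence invertible).
M = rng.standard_normal((d, d)) + 3.0 * np.eye(d)

w = rng.standard_normal(d)               # any candidate weight vector
w_t = np.linalg.inv(M).T @ w             # (M^{-1})^T w

# Margins coincide: ((M^{-1})^T w)^T (M x_i) = w^T (M^{-1} M) x_i = w^T x_i.
margins_orig = y * (X @ w)
margins_trans = y * ((X @ M.T) @ w_t)    # rows of X @ M.T are (M x_i)^T
assert np.allclose(margins_orig, margins_trans)
```

So feasibility transfers; the question is whether $(M^{-1})^\top w^*$ also remains norm-minimal, since $\|(M^{-1})^\top w\|$ generally differs from $\|w\|$.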

Accepted answer:

This is not true. Consider the SVM problem with a single constraint, given by $y=1$ and $x=(\frac12,1)^\top$. The unique optimal solution is $w^* = x/\left\|x\right\|^2 = (\frac25,\frac45)^\top$: the optimum must be a positive multiple of $x$ with the constraint tight.

Let $M=\begin{pmatrix}1 & 0 \\ -2 & 1 \end{pmatrix}$. Then, $$(M^{-1})^\top w^*=\begin{pmatrix}1 & 2 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \frac25 \\ \frac45 \end{pmatrix} = \begin{pmatrix} 2 \\ \frac45\end{pmatrix}\enspace, $$ whereas the SVM problem with the transformed data $x'=Mx=(\frac12,0)^\top$ has the unique optimal solution $x'/\left\|x'\right\|^2 = (2,0)^\top$. Note that $(M^{-1})^\top w^*$ is still feasible, since $\left((M^{-1})^\top w^*\right)^\top Mx = {w^*}^\top x = 1$, but its norm $\sqrt{4+\frac{16}{25}}$ exceeds $\left\|(2,0)^\top\right\| = 2$, so it is not optimal.
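The counterexample can be verified numerically, again using the closed form $x/\left\|x\right\|^2$ for the optimum of a single-constraint problem:

```python
import numpy as np

# Single-constraint hard-margin SVM with y = 1: the optimum is
# w* = x / ||x||^2 (a positive multiple of x with the constraint tight).
x = np.array([0.5, 1.0])
w_star = x / (x @ x)
assert np.allclose(w_star, [0.4, 0.8])            # w* = (2/5, 4/5)

M = np.array([[1.0, 0.0],
              [-2.0, 1.0]])
w_t = np.linalg.inv(M).T @ w_star                 # (M^{-1})^T w* = (2, 4/5)
assert np.allclose(w_t, [2.0, 0.8])

x_t = M @ x                                       # transformed point (1/2, 0)
w_opt = x_t / (x_t @ x_t)                         # its optimum: (2, 0)
assert np.allclose(w_opt, [2.0, 0.0])

# (M^{-1})^T w* meets the constraint with equality, but has larger norm,
# so it is feasible yet not optimal for the transformed problem.
assert np.isclose(w_t @ x_t, 1.0)
assert np.linalg.norm(w_t) > np.linalg.norm(w_opt)
```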