I know exactly what the SVM is and am clear about its principles and algorithms. But I'm curious whether all support vectors are necessarily used in constructing the weights $w$.
Let $w^Tx + b$ denote the SVM model. By the KKT complementary slackness condition,
$\alpha_i[y_i(w^Tx_i+b)-1]=0, i=1,\cdots,N,$
where $w$ and $b$ are the weights and bias, and $\alpha_i$ is the Lagrange multiplier of the $i$-th constraint.
From this we know that if $\alpha_i \neq 0$, then $y_i(w^Tx_i+b)-1=0$, and such an $x_i$ is called a support vector. Consequently, the weights are estimated as $w = \sum_{i=1}^m \alpha_{(i)} y_{(i)} x_{(i)}$, where the $x_{(i)}$'s are the support vectors.
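To make the reconstruction $w = \sum_i \alpha_i y_i x_i$ concrete, here is a minimal sketch on a hypothetical 1-D toy problem (two points, one per class), with the dual solution worked out by hand in the comments:

```python
# Toy problem: one positive point at x = 1, one negative point at x = -1.
X = [1.0, -1.0]
y = [1, -1]

# Dual: maximize sum_i alpha_i - 1/2 sum_{i,j} alpha_i alpha_j y_i y_j x_i x_j
# subject to alpha_i >= 0 and sum_i alpha_i y_i = 0.  The equality constraint
# forces alpha_1 = alpha_2 = a, so the objective is 2a - 2a^2, maximized at a = 0.5.
alpha = [0.5, 0.5]

# Both multipliers are nonzero, so both points are support vectors.
w = sum(a * yi * xi for a, yi, xi in zip(alpha, y, X))  # w = sum_i alpha_i y_i x_i
b = y[0] - w * X[0]                                     # from y_1 (w x_1 + b) = 1

# Each support vector sits exactly on the margin: y_i (w x_i + b) = 1.
assert all(yi * (w * xi + b) == 1 for xi, yi in zip(X, y))
print(w, b)  # -> 1.0 0.0
```

Here every point with $\alpha_i \neq 0$ does appear in $w$, which is the textbook situation.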
What I'm confused about is:
Is it possible that $\alpha_i = 0$ and $y_i(w^Tx_i+b)-1=0$ hold simultaneously, so that $x_i$ lies exactly on the margin (and so looks like a support vector) but is not used in constructing the weights $w$, since $\alpha_i = 0$?
Thanks for any ideas. :)