I'm looking to find a mathematically rigorous exposition on how to solve the quadratic programming problem
$$\min ||x||^2 \textrm{ subject to } Ax\leq b$$
where $x\in\mathbb{R}^n$, $A\in\mathbb{R}^{m \times n}$, $b\in\mathbb{R}^m$. This particular optimization problem comes up when building an SVM classifier on a given training data set. I have seen many articles that reduce solving an SVM to solving this particular problem, but very few that actually go into the mathematical details of how to do so (rigorously). It seems like the only places for me to look are general texts on non-linear programming, but that feels like digging far more broadly and deeply than I should need to for this specific use case.
In terms of reading material, what's the quickest path to getting to a point where I can both program a solver and understand why it works? I have a good mathematical background (pure rather than applied), and a rigorous understanding of, e.g., the simplex algorithm in linear programming.
You might want to look at these notes, especially Section 23.2. They use the Karush-Kuhn-Tucker (KKT) conditions to solve this problem (a generalization of the Lagrange multiplier method to inequality constraints). They explain the math and then work through a numerical example.
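To make the KKT approach concrete, here is a minimal sketch of a solver (not the method from the notes, and the name `min_norm_qp` is my own). It exploits the fact that for this convex QP, any KKT point is a global optimum: for a candidate active set $S$, stationarity $2x + A_S^T\mu_S = 0$ together with $A_S x = b_S$ gives $\mu_S = -2(A_S A_S^T)^{-1} b_S$ and $x = -\frac{1}{2}A_S^T\mu_S$, so one can simply enumerate active sets and return the first candidate that is both primal feasible ($Ax \le b$) and dual feasible ($\mu_S \ge 0$). This is exponential in $m$, so it is for understanding only, not a practical solver.

```python
import itertools
import numpy as np

def min_norm_qp(A, b, tol=1e-9):
    """Solve min ||x||^2 subject to Ax <= b by enumerating active sets.

    For each candidate active set S, the KKT conditions reduce to a
    linear system: stationarity 2x + A_S^T mu_S = 0 and A_S x = b_S
    give mu_S = -2 (A_S A_S^T)^{-1} b_S and x = -(1/2) A_S^T mu_S.
    A candidate is optimal iff it is primal feasible (Ax <= b) and
    dual feasible (mu_S >= 0). Exponential in m: illustration only.
    """
    m, n = A.shape
    # Empty active set: x = 0 is optimal whenever it is feasible.
    if np.all(b >= -tol):
        return np.zeros(n)
    for k in range(1, m + 1):
        for S in itertools.combinations(range(m), k):
            As, bs = A[list(S)], b[list(S)]
            G = As @ As.T
            if np.linalg.matrix_rank(G) < k:
                continue  # active rows linearly dependent; skip this set
            mu = -2.0 * np.linalg.solve(G, bs)
            if np.any(mu < -tol):
                continue  # dual infeasible: multipliers must be >= 0
            x = -0.5 * As.T @ mu
            if np.all(A @ x <= b + tol):
                return x  # primal + dual feasible => KKT point => optimum
    raise ValueError("problem is infeasible")

# Example: min ||x||^2 subject to x1 + x2 >= 1, i.e. A = [[-1,-1]], b = [-1].
A = np.array([[-1.0, -1.0]])
b = np.array([-1.0])
print(min_norm_qp(A, b))  # [0.5 0.5]
```

For real use you would plug the problem into an off-the-shelf QP solver (e.g. an active-set or interior-point method), but tracing this brute-force version against the worked example in the notes is a quick way to check your understanding of the KKT conditions.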