I'm trying to understand the following problem:
We are looking at an infinitesimal coordinate transformation $$ x^\mu \rightarrow x^\mu + \epsilon u^\mu(x), \quad \epsilon \rightarrow 0, $$ and we are interested in its effect on the line element from $x$ to $x+dx$, $$ ds^2 = g_{\mu\nu} dx^\mu dx^\nu, $$ where $g_{\mu\nu}$ is the Minkowski metric.
The book I'm reading (Local Quantum Physics: Fields, Particles, Algebras by R. Haag) gives the following formula: $$ \delta ds^2 = \epsilon g_{\mu\nu} \left( \frac{\partial u^\mu}{\partial x^l} dx^l dx^\nu + \frac{\partial u^\nu}{\partial x^l} dx^l dx^\mu \right). $$
However, when I try to compute $\delta ds^2$, I don't get the factor of $\epsilon$:
$$ \begin{split} \delta ds^2(x) &= \frac{d}{d\epsilon} ds^2(x+ \epsilon u(x))\Big|_{\epsilon=0} \\ &= \frac{d}{d\epsilon} g_{\mu\nu} \bigl(dx^\mu + \epsilon( u^\mu(x+dx) - u^\mu(x))\bigr) \bigl(dx^\nu + \epsilon( u^\nu(x+dx) - u^\nu(x))\bigr)\Big|_{\epsilon=0} \\ &= g_{\mu\nu}\Bigl[(u^\mu(x+dx) - u^\mu(x)) \bigl(dx^\nu + \epsilon( u^\nu(x+dx) - u^\nu(x))\bigr) \\ &\qquad + \bigl(dx^\mu + \epsilon( u^\mu(x+dx) - u^\mu(x))\bigr) ( u^\nu(x+dx) - u^\nu(x))\Bigr] \Big|_{\epsilon=0} \\ &= g_{\mu\nu} \bigl((u^\mu(x+dx) - u^\mu(x)) dx^\nu + dx^\mu( u^\nu(x+dx) - u^\nu(x))\bigr) \\ &= g_{\mu\nu} \left(\frac{\partial u^\mu}{\partial x^l} dx^l dx^\nu + \frac{\partial u^\nu}{\partial x^l} dx^l dx^\mu\right). \end{split} $$
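To make the comparison concrete, I also checked numerically what the actual first-order change in $ds^2$ is. This is just a sketch with made-up choices (a 2D Minkowski metric, an arbitrary field $u^\mu$, and specific values of $x$, $dx$, $\epsilon$), comparing the finite change $ds^2(\text{new}) - ds^2(\text{old})$ against the book's right-hand side *with* the factor $\epsilon$:

```python
import math

# 2D Minkowski metric, signature (+, -) -- an illustrative choice.
g = [[1.0, 0.0], [0.0, -1.0]]

# A made-up displacement field u^mu(x), just for the check.
def u(x):
    return [math.sin(x[1]), x[0] * x[1]]

# Jacobian J[mu][l] = du^mu/dx^l via central differences.
def du(x, h=1e-6):
    J = [[0.0, 0.0], [0.0, 0.0]]
    for l in range(2):
        xp, xm = list(x), list(x)
        xp[l] += h
        xm[l] -= h
        up, um = u(xp), u(xm)
        for mu in range(2):
            J[mu][l] = (up[mu] - um[mu]) / (2 * h)
    return J

def ds2(dx):
    return sum(g[m][n] * dx[m] * dx[n] for m in range(2) for n in range(2))

x = [0.3, 0.7]
dx = [1e-3, 2e-3]
eps = 1e-4

# Transformed differential: dx^mu -> dx^mu + eps * (du^mu/dx^l) dx^l.
J = du(x)
dx_new = [dx[m] + eps * sum(J[m][l] * dx[l] for l in range(2)) for m in range(2)]

# Actual change in the line element, to all orders in eps.
delta = ds2(dx_new) - ds2(dx)

# The book's formula, including the factor eps.
formula = eps * sum(
    g[m][n] * (J[m][l] * dx[l] * dx[n] + J[n][l] * dx[l] * dx[m])
    for m in range(2) for n in range(2) for l in range(2)
)

print(delta, formula)  # the two agree up to O(eps^2)
```

So the finite change does carry the $\epsilon$, whereas my $\frac{d}{d\epsilon}\big|_{\epsilon=0}$ definition strips it off, which is exactly the discrepancy I'm confused about.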
Any ideas what I'm doing wrong here? (Also, let me know if this would be more appropriate for Physics Stack Exchange; I just thought the issue was more about the computation than the underlying theory.)