In problem 4.7 of Griffiths' "Introduction to Electrodynamics, 4th Edition", to find the potential energy of a dipole in an electric field $\vec{E}$, the following step is made:
$$\lim_{\vec{d}\to 0} \,\, \int_{\vec{r}}^{\vec{r}+\vec{d}} \vec{E} \cdot d\vec{x} = \vec{E} \cdot \vec{d}$$
Can this step be justified with more explicit intermediate steps (I'm imagining something involving a first-order Taylor expansion, or similar)? How can I show this step is valid for non-uniform $\vec{E}$-fields (or is it not)?
The last line in the image below is the step I am referring to.

Mean value theorem for integration:
$$\int_a^{a+h} f(x) \, dx = h\, f(\xi)$$
for some $\xi \in (a, a+h)$, provided $f$ is continuous.

To connect this to your integral, parametrize the straight path as $\vec{x}(t) = \vec{r} + t\vec{d}$ with $t \in [0,1]$, so that
$$\int_{\vec{r}}^{\vec{r}+\vec{d}} \vec{E} \cdot d\vec{x} = \int_0^1 \vec{E}(\vec{r} + t\vec{d}) \cdot \vec{d} \, dt = \vec{E}(\vec{r} + t^*\vec{d}) \cdot \vec{d}$$
for some $t^* \in (0,1)$, by the theorem applied to the scalar function $g(t) = \vec{E}(\vec{r}+t\vec{d})\cdot\vec{d}$. As $\vec{d} \to 0$, the evaluation point $\vec{r} + t^*\vec{d} \to \vec{r}$, and continuity of $\vec{E}$ gives
$$\vec{E}(\vec{r}+t^*\vec{d})\cdot\vec{d} = \vec{E}(\vec{r})\cdot\vec{d} + o(|\vec{d}|).$$
So the step is valid for any continuous (in particular, non-uniform) field: the correction is of higher order than $|\vec{d}|$ and drops out in the dipole limit. Equivalently, if $\vec{E}$ is differentiable, a first-order Taylor expansion of $\vec{E}$ about $\vec{r}$ gives the same result with an error of order $|\vec{d}|^2$.
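A quick numerical sanity check (a sketch of my own, not from Griffiths; the field, the base point, and the direction below are arbitrary illustrative choices): for a smooth non-uniform field, the discrepancy between the line integral and $\vec{E}(\vec{r})\cdot\vec{d}$ should shrink roughly quadratically as $|\vec{d}|$ shrinks, consistent with the $o(|\vec{d}|)$ (in fact $O(|\vec{d}|^2)$) error above.

```python
def E(p):
    # An arbitrary smooth, non-uniform field chosen for illustration.
    x, y, z = p
    return (y * y, x * z, x + y)

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def line_integral(r, d, n=2000):
    # Trapezoid rule for the parametrized line integral
    # \int_0^1 E(r + t d) . d dt along the straight segment r -> r + d.
    total = 0.0
    for i in range(n + 1):
        t = i / n
        p = tuple(ri + t * di for ri, di in zip(r, d))
        w = 0.5 if i in (0, n) else 1.0
        total += w * dot(E(p), d)
    return total / n

r = (1.0, 2.0, 3.0)       # base point (arbitrary)
u = (0.3, -0.5, 0.2)      # direction of the separation vector (arbitrary)

for h in (1e-1, 1e-2, 1e-3):
    d = tuple(h * ui for ui in u)
    err = abs(line_integral(r, d) - dot(E(r), d))
    print(f"|d| scale {h:g}: |integral - E(r).d| = {err:.3e}")
```

Shrinking $|\vec{d}|$ by a factor of 10 should shrink the discrepancy by roughly a factor of 100, i.e. the error vanishes faster than $|\vec{d}|$ itself, which is exactly what the step in the book requires.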