Background
I was naively playing around with some interpolation ideas once again and came across the dual numbers as a way to perform differentiation implicitly. Naturally I thought: perhaps there is some way to map dual numbers to the real numbers using a polynomial function. As it turns out, in general this is not the case, and I'm struggling to understand why.
Symptoms
For example, consider a function $f$ such that $f(0)=0$ and $f(\varepsilon)=1$. If $f$ is analytic, then (using $\varepsilon^2=0$ to truncate the Taylor series):
$$ f(\varepsilon)=f(0)+f'(0)\cdot\varepsilon=1\implies f'(0)\cdot\varepsilon=1 $$
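To make this concrete, here is a minimal sketch of dual-number arithmetic in Python; the `Dual` class and `poly` helper are my own ad-hoc names, not from any library. It shows that any polynomial $f$ evaluated at $\varepsilon$ yields $f(0)+f'(0)\varepsilon$, so the real part is pinned to $f(0)$:

```python
# Minimal dual-number sketch: a + b*eps with eps**2 = 0.
# The Dual class and poly() helper are ad-hoc, not a library API.
class Dual:
    def __init__(self, re, im=0.0):
        self.re = re  # real part a
        self.im = im  # dual part b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.re + other.re, self.im + other.im)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 = 0
        return Dual(self.re * other.re, self.re * other.im + self.im * other.re)

    __rmul__ = __mul__

def poly(x, coeffs):
    """Evaluate a polynomial at x; coeffs are listed constant term first."""
    acc, power = Dual(0.0), Dual(1.0)
    for c in coeffs:
        acc = acc + c * power
        power = power * x
    return acc

eps = Dual(0.0, 1.0)
f = lambda x: poly(x, [0.0, 3.0, 5.0])  # f(t) = 3t + 5t^2, so f(0) = 0
val = f(eps)
print(val.re, val.im)  # real part is f(0) = 0.0: it can never equal 1
```

Whatever coefficients you pick with zero constant term, the real part of `f(eps)` stays $0$, which is exactly the contradiction above.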
Obviously there is no solution to this: $f'(0)\cdot\varepsilon$ has zero real part, while $1$ does not. So perhaps we could employ, say, a Lagrange polynomial, which when simplified would give:
$$ f(x) = \frac{x}{\varepsilon} $$
Again, this makes no sense: $\varepsilon$ is a zero divisor ($\varepsilon\cdot\varepsilon=0$), so it has no multiplicative inverse, and multiplying numerator and denominator by $\varepsilon$ exposes a division by zero. Over the real numbers such an expression might 'work' (using $\frac{a}{a}=1$ and $\frac{0}{a}=0$ for non-zero $a$), but in this context, not so.
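As a quick sanity check on the non-invertibility claim, here is a tiny sketch representing a dual number $a+b\varepsilon$ as the pair `(a, b)`; `dual_mul` is my own helper, not a library function:

```python
# Represent a + b*eps as the pair (a, b); ad-hoc helper, not a library API.
def dual_mul(x, y):
    a, b = x
    c, d = y
    return (a * c, a * d + b * c)  # eps**2 = 0 kills the b*d term

eps = (0.0, 1.0)
# Multiplying eps by ANY dual number z = (a, b) gives (0, a):
# the real part is always 0, so eps*z = 1 has no solution.
for z in [(1.0, 0.0), (2.0, -3.0), (0.0, 7.0)]:
    print(dual_mul(eps, z))  # never (1.0, 0.0)
```

Since no $z$ satisfies $\varepsilon z = 1$, the expression $x/\varepsilon$ simply has no meaning in the dual numbers.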
As a last-ditch effort, I thought we could use the matrix representation of a dual number, reframing the problem as finding a function $F$ such that: \begin{align*} F\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \\ F\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \\ \end{align*}
However, now the problem becomes that not all functions which interpolate these values entrywise (i.e. $A_{1,1}\to B_{1,1}$) are well-defined as matrix functions.
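The matrix picture runs into the same obstruction, which can be checked numerically. Below is a sketch using plain nested lists (the `mat_mul`, `mat_add`, and `scale` helpers are my own, not a library): the matrix $N$ representing $\varepsilon$ is nilpotent, so any polynomial $p$ with $p(0)=0$ gives $p(N)=p'(0)\,N$, which is never the identity:

```python
# 2x2 matrices as nested lists; mat_mul/mat_add/scale are ad-hoc helpers.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def scale(c, A):
    return [[c * A[i][j] for j in range(2)] for i in range(2)]

# N represents eps under the correspondence a + b*eps <-> [[a, b], [0, a]].
N = [[0.0, 1.0], [0.0, 0.0]]
print(mat_mul(N, N))  # the zero matrix: N is nilpotent, mirroring eps**2 = 0

# For p(t) = 3t + 5t^2 (so p(0) = 0): p(N) = 3N + 5N^2 = 3N.
P = mat_add(scale(3.0, N), scale(5.0, mat_mul(N, N)))
print(P)  # [[0.0, 3.0], [0.0, 0.0]] -- never the identity matrix
```

So any polynomial $F$ fixing the zero matrix is structurally barred from sending $N$ to the identity, exactly as in the dual-number formulation.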
Question
Obviously, up until now I've listed a couple of things that I've tried, but I suspect they're merely symptoms of the same underlying problem with attempting a mapping like this. Putting aside the feasibility of the outcome, is there any particular reason that this idea fails from the very beginning?