Recently I ran across the problem of trying to solve the following equation, where $\varepsilon \in (0,1)$ and $x \geq 0$:
$$ x^{1+\varepsilon} - x - 2\pi = 0. \tag{eq}$$
Luckily, for my purposes it suffices to know that the unique positive root of this equation lies between $1$ and $2\sqrt[\varepsilon]{2}$ when $\varepsilon < \frac 1 2$, but I am now curious: are there methods for finding the exact roots of such "almost polynomials"?
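For anyone who just needs a numerical value: since the left-hand side is negative at $x=1$ and positive at $x = 2\sqrt[\varepsilon]{2}$ (for $\varepsilon < \frac12$), plain bisection on that bracket works. A minimal pure-Python sketch (function names `f` and `bisect` are my own; $\varepsilon = 0.4$ is just an illustrative choice):

```python
import math

def f(x, eps, c=2 * math.pi):
    # The equation from the question: f(x) = x^(1+eps) - x - c with c = 2*pi.
    return x ** (1 + eps) - x - c

def bisect(eps, tol=1e-12):
    # Bracket from the question: f(1) = -2*pi < 0, and f > 0 at
    # x = 2 * 2^(1/eps) when eps < 1/2, so bisection converges to the root.
    lo, hi = 1.0, 2 * 2 ** (1 / eps)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid, eps) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

root = bisect(0.4)
print(root)  # roughly 6.0 for eps = 0.4
```

Bisection is slow but robust here; since $f$ is smooth and strictly increasing past its minimum, Newton's method started from the upper bracket would also converge.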
EDIT: As a closing comment, in case I or someone else needs to approximate the solution in the future, here's an ad hoc argument to show that for any $\beta\in(0,1)$ there exists $\varepsilon_0>0$ such that for all $\varepsilon < \varepsilon_0$ the unique positive root of equation (eq) is between $\varepsilon^{-\beta}$ and $\varepsilon^{-1}$.
Denote \begin{align*} h(x) = x^{1+\varepsilon} - x = x(x^\varepsilon - 1), \end{align*} so that $h([0,1]) \subset [-1,0]$, $h(1) = 0$, $h'(x) = (1+\varepsilon)x^\varepsilon -1 > 0$ for all $x > 1$, and $h''(x) > 0$ for all $x > 0$. Thus equation (eq) has a unique solution in $[1,\infty)$, denoted $S(\varepsilon)$, and $S(\varepsilon) > S(\varepsilon')$ whenever $\varepsilon < \varepsilon'$.
To estimate $S(\varepsilon)$ in terms of $\varepsilon$, observe that for any $\beta \in (0,1)$ we have \begin{align*} h( \varepsilon^{-\beta} ) = \varepsilon^{-\beta} \left(\varepsilon^{-\beta \varepsilon} - 1\right) =\frac{\varepsilon^{-\beta \varepsilon} - 1}{\varepsilon^\beta}, \end{align*} and by l'Hôpital's rule we see that \begin{align*} \lim_{x \to 0^+} \frac{x^{-\beta x}-1}{x^\beta} &=\lim_{x \to 0^+} \frac{D \left(e^{-\beta x \log(x)}-1\right)}{\beta x^{\beta-1}} \\ &=\lim_{x \to 0^+} \frac{-x^{-\beta x} \beta (\log(x)+1)}{\beta x^{\beta-1}} \\ &=\lim_{x \to 0^+} -\underbrace{\left( x^x \right)^{-\beta}}_{\to 1} \underbrace{x^{1-\beta} (\log(x)+1)}_{\to 0} = 0. \end{align*} This means that for any $\beta < 1$ and all $\varepsilon > 0$ small enough we have $h(\varepsilon^{-\beta}) \leq 2\pi = h(S(\varepsilon))$, and since $h$ is increasing on $[1,\infty)$, it follows that $S(\varepsilon) \geq \varepsilon^{-\beta}$. Similarly, by repeating the calculation above with $\beta = 1$ we see that \begin{align*} \lim_{x \to 0^+} \frac{x^{- x}-1}{x} &=\lim_{x \to 0^+} -\underbrace{\left( x^x \right)^{-1}}_{\to 1} \underbrace{(\log(x)+1)}_{\to -\infty} = \infty, \end{align*} so $h(\varepsilon^{-1}) \geq 2\pi$ and thus $S(\varepsilon) \leq \varepsilon^{-1}$ for $\varepsilon > 0$ small enough.
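The bounds above kick in only once $\varepsilon$ is quite small (roughly $\varepsilon \lesssim e^{-2\pi}$ for the upper one, since $h(\varepsilon^{-1}) \approx -\log\varepsilon$). A quick numerical sanity check of the sandwich $\varepsilon^{-\beta} \leq S(\varepsilon) \leq \varepsilon^{-1}$, with the illustrative choices $\varepsilon = 10^{-3}$, $\beta = 0.9$ (both mine, not from the argument):

```python
import math

eps, beta = 1e-3, 0.9

def h(x):
    # h(x) = x^(1+eps) - x = x(x^eps - 1), as in the argument above.
    return x ** (1 + eps) - x

# h is increasing on [1, inf), so checking the sign of h - 2*pi at the
# two endpoints verifies that S(eps) lies between them.
lo, hi = eps ** -beta, eps ** -1
assert h(lo) < 2 * math.pi < h(hi)

# Bisection for S(eps) itself:
a, b = lo, hi
while b - a > 1e-9:
    m = (a + b) / 2
    if h(m) < 2 * math.pi:
        a = m
    else:
        b = m
S = (a + b) / 2
print(S)  # lands between eps^(-0.9) ~ 501 and eps^(-1) = 1000
```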
Nope. Even if we know $\varepsilon$ to be a fraction, $\varepsilon=\tfrac ab$ with $a,b$ positive integers, the equation $x^{1+\varepsilon}-x-c=0$ rearranges to $x^{1+a/b} = x + c$, and raising both sides to the $b$-th power reduces it to
$$x^{a+b}=(x+c)^b$$
which we cannot always solve exactly: by the Abel–Ruffini theorem, only polynomials of degree four or less are guaranteed to be solvable in radicals, and here the degree is $a+b$.