Consider the following equation:
$$ y = \frac{1}{1+\mathrm{e}^{x}} + \frac{1}{1+\log(x)} $$
Is it possible to solve for $x$ analytically (as a function of $y$)? I guess this may not be possible, or is it?
Anyway, assuming this is not possible, let us suppose that $y$ is real, i.e., restrict the domain of $x$ to the range where $y$ has no imaginary part (so in particular $x>0$, where $\log(x)$ is real).
Let us further suppose that we know a point $(x_1, y_1)$ lying on the curve of this equation. My question is: is there a way to approximate $x_2$ for a given $y_2$ such that $\lvert y_1-y_2 \rvert < \delta$, where $\delta$ is a very small real number?
First of all, I'm fairly certain that there is no way to invert your function exactly. There are a number of ways to get around this, and I will investigate a few over the course of this answer. We will start with the most naive approach, a Maclaurin expansion. Since $\frac{1}{1+\mathrm{e}^x}\approx\frac12$ for small $x$, for all $x<10$ your function is roughly approximated (correct to at least the integer part) by $$y\approx\frac{1}{1+\log(x)} + \frac 12 \implies x\approx \mathrm{e}^{\frac{2}{2y-1}-1}$$
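As a quick numerical sanity check (Python, with helper names of my own choosing), inverting the small-$x$ approximation $y\approx\frac12+\frac{1}{1+\log x}$ for $x$ gives $x\approx \mathrm{e}^{2/(2y-1)-1}$:

```python
import math

def f(x):
    """The original function y(x)."""
    return 1/(1 + math.exp(x)) + 1/(1 + math.log(x))

def invert_small_x(y):
    """Invert the small-x approximation y ~ 1/2 + 1/(1 + log x)."""
    return math.exp(2/(2*y - 1) - 1)

x = 0.05
y = f(x)
x_hat = invert_small_x(y)
print(x, x_hat)  # the recovered x agrees with the true x to within a few percent
```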
This becomes increasingly accurate as $x$ approaches zero, and does so quite rapidly. Farther from zero, however, $y$ is much better approximated by $$y\sim\frac{1}{1+\log(x)} \implies x\sim e^{\frac{1}{y}-1}$$ In fact, this is better for all $x>1.0986$ (approximately), and its error decays to $4.663\times10^{-15}$ already at $x=33$, continuing to decrease exponentially to $0$. This is due to the rapid growth of the exponential function, in contrast to the extremely slow growth of the logarithm. As such, if your values of $x$ are anything greater than $1$, this is absolutely the way to go.
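A short Python check of the large-$x$ inverse (function names are mine, not from the question):

```python
import math

def f(x):
    """The original function y(x)."""
    return 1/(1 + math.exp(x)) + 1/(1 + math.log(x))

x = 33.0
y = f(x)
x_back = math.exp(1/y - 1)   # large-x inversion: x ~ e^{1/y - 1}
print(abs(x_back - x))       # tiny; the neglected 1/(1+e^x) term is ~4.7e-15
```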
Addendum
If you really desire a function that works decently and is rational, then I recommend Padé approximants. They are based on Taylor series, but work decently well outside the Taylor series' radius of convergence. For example, the error in the approximation $$\log(x)\approx\frac{3x^2-3}{x^2+4x+1}$$ increases by roughly $\ln(10)\approx2.3$ for every $10\times$ increase in $x$. For example, the error at $x=10^{14}$ is $29.236$, and the error at $x=10^{15}$ is $31.539$. The error is $0$ in the neighborhood of $x=1$, and is fairly large for small values of $x$ (since $\log(x)\to-\infty$ as $x\to0^+$ while the rational approximation stays bounded). If this is an issue you could always calculate a higher-order Padé approximant; the Mathematica code is `PadeApproximant[f, {x, x0, n}]`,
where $f$ is the function, $x$ is the variable, $x_0$ is the point around which to approximate, and $n$ is the order of the approximation (the highest power of $x$). Increasing $n$ increases the accuracy, and you can construct a Padé approximant around any given point. However, the result rapidly gets ugly, being too long to display legibly in MathJax. If the expression is so ugly, why do I mention it? Because a Padé approximant can augment the great approximation provided near the top of the post, and this is absolutely worthwhile if you wish to approximate for values around $1$. Don't compute a Padé approximant of the whole function as I just did, though... your sanity will decrease rapidly. Instead, approximate $\log(x)$ with a Padé approximant and $e^x$ with a Taylor series, then solve for $x$. If you drop unneeded terms, this should provide a good approximation that can be solved exactly. The Taylor and Padé expressions can both be taken to fourth order with $x$ still solvable exactly, although second order is far easier to work with.
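To make this concrete, here is a small Python sketch (helper names are my own) using the $[2/2]$ Padé approximant of $\log(x)$ about $x=1$, namely $\frac{3x^2-3}{x^2+4x+1}$, together with the second-order Taylor polynomial of $e^x$:

```python
import math

def pade_log(x):
    """[2/2] Pade approximant of log(x) about x = 1."""
    return (3*x**2 - 3)/(x**2 + 4*x + 1)

def f(x):
    """The original function y(x)."""
    return 1/(1 + math.exp(x)) + 1/(1 + math.log(x))

def f_hybrid(x):
    """Replace e^x by its 2nd-order Taylor polynomial and log(x) by pade_log."""
    taylor_exp = 1 + x + x**2/2
    return 1/(1 + taylor_exp) + 1/(1 + pade_log(x))

# For large x, pade_log tends to 3, so its error grows like log(x) - 3:
print(abs(math.log(1e14) - pade_log(1e14)))   # ~29.236

# Near x = 1 the hybrid tracks the true function closely:
print(abs(f(0.9) - f_hybrid(0.9)))            # ~0.0135
```

Since `f_hybrid` is a rational function of $x$, setting $y = f_{\text{hybrid}}(x)$ and clearing denominators leaves a polynomial equation in $x$, which is what makes the exact solve possible.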
Of course, this is likely a lot of overkill. For basically any practical value greater than $10$ or so, the second approximation near the top will work perfectly well, with the error already on the order of $10^{-5}$, likely within most measurement precision. If all this doesn't do it for you, there are other techniques that track the error of a Taylor or Padé approximation to get within $\delta$ of a value, but these get complicated, and can become impossible, very quickly depending on the restrictions imposed.
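For the original question of hitting a target $y_2$ to within $\delta$, a standard alternative to tracking series error is plain Newton iteration seeded with the closed-form guess above; it works because $f$ is strictly decreasing for $x>e^{-1}$ (both terms have negative derivative there). A minimal Python sketch, with names of my own choosing:

```python
import math

def f(x):
    """The original function y(x)."""
    return 1/(1 + math.exp(x)) + 1/(1 + math.log(x))

def fprime(x):
    """Exact derivative of f."""
    ex = math.exp(x)
    return -ex/(1 + ex)**2 - 1/(x*(1 + math.log(x))**2)

def solve_for_x(y2, delta=1e-12, max_iter=50):
    """Newton iteration on f(x) - y2, seeded with the large-x inverse."""
    x = math.exp(1/y2 - 1)      # closed-form first guess from above
    for _ in range(max_iter):
        r = f(x) - y2
        if abs(r) < delta:
            break
        x -= r/fprime(x)
    return x

y2 = f(5.0)                      # pretend we only know y2
x2 = solve_for_x(y2)
print(x2)                        # close to 5.0, with |f(x2) - y2| < 1e-12
```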