Is there any kind of expansion of $f(x)=\frac{1}{1-x}$, possibly in polynomials, such that with only a few terms I can represent the function over the interval $[0, 1)$ with an error smaller than $10\%$?
I understand that the Taylor polynomial, if I expand around $0$, needs many terms, since the derivatives of $f$ grow quickly as $x \rightarrow 1$.
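To make the scale concrete, here is a small check (a sketch using only the standard library) of how many terms of the Taylor (geometric) series $\sum_k x^k$ are needed near $x = 0.999$; the relative error of the truncated series is exactly $x^N$:

```python
import math

x = 0.999
f = 1.0 / (1.0 - x)

# Partial sum of the Taylor series around 0: sum_{k=0}^{N-1} x^k.
# Its relative error is exactly x^N, so solve x^N < 0.1 for N.
N = math.ceil(math.log(0.1) / math.log(x))
partial = sum(x**k for k in range(N))
rel_err = abs(partial - f) / f
print(N, rel_err)  # roughly 2300 terms for a 10% relative error
```

So at $x = 0.999$ the plain Taylor expansion needs on the order of $2\,300$ terms for $10\%$ accuracy, which illustrates the problem.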
Edit:
My goal is the following: the function in my particular problem is $\frac{1}{1-x_{l_1} x_{l_2}}$, where $x_{l_1}, x_{l_2}$ are two functions in Fourier space and $l_2 = L-l_1$, and I would like to apply the convolution theorem using an expansion. The problem is that the product satisfies $0<x_{l_1} x_{l_2}<1$ but can be near $0.999$, so I have to use a lot of terms.
So if I have $\int dl_1\, \frac{1}{1-x_{l_1} x_{l_2}}$, this would become $\sum_n \int dl_1\, P_n(x_{l_1},x_{l_2})$ for some polynomials, such that to each polynomial I can apply the convolution theorem, i.e. $F[F^{-1}[\,\cdot\,]_{l_1} \cdot F^{-1}[\,\cdot\,]_{l_2}]$, where $F$ is the Fourier transform and $F^{-1}$ its inverse.
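For reference, the plain geometric expansion already has this separable form, each term factorizing into an $l_1$ part and an $l_2$ part (presumably this is the expansion that forces so many terms):
$$\frac{1}{1-x_{l_1} x_{l_2}} = \sum_{n=0}^{\infty} x_{l_1}^n\, x_{l_2}^n, \qquad \int dl_1\, x_{l_1}^n\, x_{l_2}^n = F\big[F^{-1}[x^n]_{l_1} \cdot F^{-1}[x^n]_{l_2}\big].$$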
You can sample points $(x_i, f(x_i))$ on $[0,1)$ with a higher density towards $x\to 1$.
Then use a polynomial regression for the dataset $\mathcal{D}=\{(x_1, f(x_1)),\ldots,(x_N, f(x_N))\}$ with $N$ data points. The problem with polynomials is that the relative error is unbounded as $x\to 1$: any polynomial stays finite there, while $f$ diverges.
You could start by fitting a quadratic polynomial with the regression equation
$$f(x_i)=w_0+w_1x_i+w_2x_i^2+\varepsilon_i$$
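A minimal sketch of this fit, assuming NumPy and an illustrative sampling scheme (squashing a uniform grid so points cluster towards the right end of $[0, 0.95]$); it also shows the unbounded-error problem by extrapolating the fitted quadratic to $x = 0.999$:

```python
import numpy as np

# Sample (x_i, f(x_i)) on [0, 0.95], denser towards the right end:
# the map u -> 1-(1-u)^2 clusters a uniform grid near its upper limit.
u = np.linspace(0.0, 1.0, 200)
x = 0.95 * (1.0 - (1.0 - u) ** 2)
y = 1.0 / (1.0 - x)

# Least-squares fit of the regression model f(x_i) = w0 + w1*x_i + w2*x_i^2 + eps_i.
w2, w1, w0 = np.polyfit(x, y, deg=2)

# The quadratic cannot follow the pole: near x = 1 it stays finite
# while f diverges, so the relative error there is huge.
f_true = 1.0 / (1.0 - 0.999)                # about 1000
f_pred = w0 + w1 * 0.999 + w2 * 0.999 ** 2
print(f_pred, f_true)
```

Higher-degree fits push the breakdown closer to $1$ but never remove it, which is the unbounded-error caveat above.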