I want to find an explicit formula for a sequence $f_n$ of polynomials that converges uniformly to $\exp(\sin(x))$ on $[0,2014]$. The Taylor expansion is terrible for this function, so I think there must be other ways, but I don't know what they are.
Approximating $\exp(\sin(x))$ with polynomials
1k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 solutions below.
If you want to approximate functions by polynomials, I'd recommend you look at the Chebfun system.
Suppose we are given a function $f$ that we want to approximate by polynomials (in the uniform norm). For any given degree $n$, the best approximation is difficult to construct: you need some sort of numerical procedure, like the Remez algorithm, so there is not likely to be a closed-form formula. But you can get an approximation that is close to optimal by constructing Lagrange interpolants at Chebyshev points, and for these you can obtain explicit formulae. This is how Chebfun works. The details are here, especially Theorem 1 in section 4.5.
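As a concrete illustration (my own sketch, not part of the answer): NumPy's polynomial module can build such a Chebyshev interpolant directly. The demo below interpolates $e^{\sin x}$ over a single period $[0, 2\pi]$ rather than all of $[0,2014]$, since the full interval would need an enormous degree; the degree 50 and the interval are arbitrary choices for the demo.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Demo choices (not from the answer): one period [0, 2*pi] and degree 50.
a, b = 0.0, 2 * np.pi

def f(t):
    # chebinterpolate samples t in [-1, 1]; map it to [a, b] first
    x = 0.5 * (b - a) * (t + 1) + a
    return np.exp(np.sin(x))

# Coefficients (in the Chebyshev basis) of the degree-50 interpolant
# at Chebyshev points on [-1, 1]
coef = C.chebinterpolate(f, 50)

# Sup-norm error on a fine grid; for an entire function like this it
# decays geometrically with the degree
t = np.linspace(-1, 1, 10001)
err = np.max(np.abs(C.chebval(t, coef) - f(t)))
print(err)
```

For comparison, a Taylor polynomial of the same degree about a single point would be far less accurate near the ends of the interval.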
Approximations based on Taylor series are often quite poor; they are very good at one point, but deteriorate rapidly as you move away from this favored point.
This is not the answer you wanted, but technically it is a series of polynomials with an explicit formula.
If you want to show such a polynomial sequence exists, then, as @Winther noted, substitute $\sin(x)$ into the Taylor series of $e^{x}$:
(1) $$\sum_{i=0}^{\infty}\frac{\left(\sin(x)\right)^{i}}{i!}$$
Then substitute the Taylor series of $\sin(x)$ into (1):
(2) $$\sum_{i=0}^{\infty}\frac{\left(\sum_{j=0}^{\infty}(-1)^{j}{\frac{x^{2j+1}}{\left(2j+1\right)!}}\right)^{i}}{i!}$$
Note that if (1) is uniformly convergent and the Taylor series of $\sin(x)$, $\sum_{j=0}^{\infty}(-1)^{j}{\frac{x^{2j+1}}{\left(2j+1\right)!}}$, is uniformly convergent, then so is (2).
(2) seems impossible to simplify into a single sum, but it shows that a series of polynomials with an explicit formula is possible. If you think a simplification might be possible, you can try.
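For what it's worth, the truncated version of (2) can be checked numerically. This is only a sketch: the truncation levels `N` and `J` and the test interval $[0,3]$ are arbitrary choices of mine, and on a large interval like $[0,2014]$ far more terms would be needed.

```python
import math
import numpy as np

# Numerical sketch of the double series (2): replace both infinite sums
# with partial sums and compare against exp(sin(x)). N and J are
# arbitrary truncation levels chosen for illustration.
def truncated_series(x, N=20, J=20):
    # Partial Taylor sum for sin(x)
    s = sum((-1)**j * x**(2*j + 1) / math.factorial(2*j + 1)
            for j in range(J + 1))
    # Partial sum of the exponential series evaluated at that sum
    return sum(s**i / math.factorial(i) for i in range(N + 1))

x = np.linspace(0, 3, 50)
approx = truncated_series(x)
exact = np.exp(np.sin(x))
print(np.max(np.abs(approx - exact)))
```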
I can't help but start by ranting that using polynomials to approximate periodic functions over many periods is a really bad idea. For this particular example we have $\frac{2014}{2\pi} \approx 320$ periods of the function over $[0,2014]$, so any polynomial that approximates it well would need to have degree $\gg 640$ ($640$ just to match the number of extremal points of $f$, and much more is needed for the approximation to be good on the whole interval). For practical applications such a high-degree polynomial over such a large interval would be close to useless, as truncation errors would dominate any computation of it.
Anyway, here are some alternatives for how to construct the sequence you are after:
Taylor series:
Since $f(x) = e^{\sin(x)}$ is an entire function, its Taylor series about any point converges for all $x$, and the convergence is uniform on a bounded interval like $[0,2014]$. The problem, as you note, is writing down the explicit form. One way to do this is to write
$$e^{\sin(x)} = \sum_{n=0}^\infty \frac{\sin^n(x)}{n!}$$
Now expand $\sin^n(x)$ in a Fourier series (perhaps simplest via de Moivre's formula): $\sin^n(x) = \sum_{k=0}^n a_k\cos(kx) + b_k\sin(kx)$, and finally expand $\sin(kx)$ and $\cos(kx)$ in Taylor series. The result will not be pretty, but it should work.
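A sketch of the Fourier-expansion step (my own illustration, not part of the answer): by de Moivre/Euler, $\sin(x) = (e^{ix} - e^{-ix})/(2i)$, so the binomial theorem gives the finite Fourier expansion of $\sin^n(x)$ directly. The check below verifies the identity numerically for one arbitrary $n$.

```python
import numpy as np
from math import comb

# sin(x) = (e^{ix} - e^{-ix}) / (2i), so by the binomial theorem
#   sin^n(x) = (2i)^{-n} * sum_{k=0}^{n} C(n,k) (-1)^k e^{i(n-2k)x},
# a finite Fourier combination of frequencies up to n.
def sin_power(x, n):
    total = np.zeros_like(x, dtype=complex)
    for k in range(n + 1):
        total += comb(n, k) * (-1)**k * np.exp(1j * (n - 2*k) * x)
    # The imaginary part is zero up to rounding; keep the real part
    return (total / (2j)**n).real

x = np.linspace(0, 2 * np.pi, 200)
n = 7  # arbitrary demo exponent
print(np.max(np.abs(sin_power(x, n) - np.sin(x)**n)))
```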
Weierstrass theorem and Bernstein polynomials:
If we only need to demonstrate that the sequence you mention exists, we can use the Weierstrass approximation theorem. Since $f$ is a continuous function and $[0,2014]$ is a compact interval, for any $\epsilon > 0$ there exists a polynomial $P_\epsilon(x)$ such that $\sup_{x\in[0,2014]}|P_\epsilon(x) - f(x)| < \epsilon$. Note that this is non-constructive: it only gives the existence of such a polynomial, without any information about what it looks like. However, typical proofs of the theorem do provide a method (usually via Bernstein polynomials, but there are other possibilities).
For example we have the following theorem: if $g$ is a continuous function on $[0,1]$ then
$$B_n(g)(x) = \sum_{k=0}^n g\left(\frac{k}{n}\right)b_{k,n}(x)$$
where $b_{k,n}(x) = {n\choose k} x^k (1-x)^{n-k}$ are the Bernstein basis polynomials; then $B_n(g)$ converges uniformly to $g$ on $[0,1]$ (see this page for a proof). Applying this theorem to the function $g(x) = f(2014x)$ and rescaling back, $f_n(x) = B_n(g)(x/2014)$, gives the desired sequence.
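A minimal sketch of the Bernstein construction (the function names are mine). Bernstein convergence is uniform but notoriously slow, so the demo measures the error for the tamer function $\sin$ on $[0,1]$; for the rescaled $g(x) = f(2014x)$ the $n$ needed for a visible fit would be astronomically large.

```python
import numpy as np
from math import comb

# B_n(g)(x) = sum_k g(k/n) * C(n,k) x^k (1-x)^(n-k) on [0, 1]
def bernstein(g, n, x):
    x = np.asarray(x, dtype=float)
    return sum(g(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# The answer's rescaled target; shown for completeness, but the n needed
# for a good fit here is far beyond anything computable this way.
f = lambda x: np.exp(np.sin(x))
g = lambda x: f(2014 * x)

# Observe the (slow, roughly O(1/n)) convergence on a tame function
x = np.linspace(0, 1, 101)
for n in (10, 100, 1000):
    print(n, np.max(np.abs(bernstein(np.sin, n, x) - np.sin(x))))
```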
Lagrange interpolating polynomials:
A third possibillity would be to write down the $n$’th Lagrange interpolating polynomial of $f(x)$ with grid points $x_i = \frac{2014i}{n}$ for $i=0,1,\ldots,2014$, namely $$P_n(x) = \sum_{k=0}^nf(x_k)\prod_{i=0,i\not= k}^n\frac{x-x_i}{x_i-x_j}$$ However it’s not always guaranteed that such a polynomial will converge uniformly. The given $f(x)$ might have the right properties for this, but this will have to be checked.