Polynomial approximation for $f$ induces an approximation to $\sqrt f$?


Assume $f:[0,1] \rightarrow \mathbb{R}$ satisfies $f(t)\geq 0$ and $f(0)=0$.

I am looking for machinery which, given a polynomial approximation of $f$ of a certain degree, determines the highest order of polynomial approximation that can be achieved for $\sqrt{f}$ (based on the coefficients of the approximation of $f$).

Also, I would like an algorithmic way to determine the coefficients of the approximation of $\sqrt{f}$.

Example: assume $f(t)= at^2+bt^3+ct^4 + R(t)$, where $\frac{R(t)}{t^4} \stackrel{t\rightarrow 0}{\longrightarrow} 0$ and $R(0)=0$.
Note that for such an $f$, $\sqrt{f(t)}$ is differentiable at $0$ and $\frac{d}{dt}\sqrt{f(t)}\big|_{t=0}=\sqrt a$.

A particular question: for which such functions $f$ (that is, for which coefficients $a,b,c$) does there exist a third-order polynomial approximation of $\sqrt{f(t)}$? Does the answer depend only on $a,b,c$?

That is, do there exist coefficients $p,q$ such that

$\sqrt{f(t)}= \sqrt a\, t+pt^2+qt^3 + \tilde R(t)$, where $\frac{\tilde R(t)}{t^3} \stackrel{t\rightarrow 0}{\longrightarrow} 0$?

Is there a simple way to deduce the values of $p,q$ as functions of $a,b,c$, other than writing an expression for the approximation, squaring it, and comparing coefficients?
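One such way, assuming $a>0$ (a sketch, not a full answer): factor out $t^2$ and apply the binomial expansion $(1+u)^{1/2}=1+\tfrac{u}{2}-\tfrac{u^2}{8}+o(u^2)$ with $u=\tfrac ba t+\tfrac ca t^2+o(t^2)$:

$$\sqrt{f(t)}=\sqrt a\,t\,\sqrt{1+\tfrac ba t+\tfrac ca t^2+o(t^2)}=\sqrt a\,t+\frac{b}{2\sqrt a}\,t^2+\left(\frac{c}{2\sqrt a}-\frac{b^2}{8a^{3/2}}\right)t^3+o(t^3),$$

so in this case the $t^2$ and $t^3$ coefficients exist and depend only on $a,b,c$.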

Note:

We cannot use the result on Taylor polynomials for compositions of functions, since $g(t)=\sqrt t$ is not differentiable at $0$.

Also, it is interesting to ask what changes if we only require an approximation of $\sqrt{f(t)}$ up to an order lower than 3: do we get more functions for which this holds? And what changes if we assume approximations of different orders for the original function?

Lastly, note that the existence of an $n$th-order approximation does not imply that $f$ is $n$ times differentiable. If we do assume some degree of differentiability, does it help?

Answer:

I will try a standard technique for fiddling with power series.

If $g(t) =\sqrt{f(t)} $, then, differentiating, $g'(t) =\frac {f'(t)}{2\sqrt{f(t)}} =\frac {f'(t)}{2g(t)} $ so $2g'(t)g(t) = f'(t) $.

If $f(t) =\sum_{n \ge 0} a_n t^n $ and $g(t) =\sum_{n \ge 0} b_n t^n $, then $f'(t) =\sum_{n \ge 1} na_n t^{n-1} =\sum_{n \ge 0} (n+1)a_{n+1} t^{n} $ and $g'(t) =\sum_{n \ge 1} nb_n t^{n-1} =\sum_{n \ge 0} (n+1)b_{n+1} t^{n} $.

Multiplying the series for $g(t)$ and $g'(t)$, and equating the result to the series for $f'(t)$, you get a recurrence that allows the coefficients $b_n$ to be computed iteratively.
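As a concrete sketch of that recurrence (the factoring step and names are my own additions): since in this problem $f$ begins at order $t^2$, write $f(t)=t^2 h(t)$ with $h(0)=a>0$, compute the series square root of $h$, and shift by one power of $t$. Assuming plain floating-point arithmetic:

```python
import math

def sqrt_series(h, n_terms):
    """First n_terms coefficients c of g(t) = sqrt(h(t)), assuming h[0] > 0.

    Equating coefficients of t^n in 2*g'(t)*g(t) = h'(t) gives
        2*(n+1)*c[0]*c[n+1] + 2*sum_{j=0}^{n-1} (j+1)*c[j+1]*c[n-j]
            = (n+1)*h[n+1],
    which determines c[n+1] from c[0], ..., c[n].
    """
    h = list(h) + [0.0] * max(0, n_terms - len(h))  # pad with zeros
    c = [math.sqrt(h[0])]
    for n in range(n_terms - 1):
        s = sum((j + 1) * c[j + 1] * c[n - j] for j in range(n))
        c.append(((n + 1) * h[n + 1] - 2 * s) / (2 * (n + 1) * c[0]))
    return c

# Example: f(t) = t^2 + 2t^3 + 3t^4 = t^2 * h(t) with h = [1, 2, 3],
# so sqrt(f(t)) = t * sqrt(h(t)).
print(sqrt_series([1.0, 2.0, 3.0], 3))  # [1.0, 1.0, 1.0]
```

Here the output says $\sqrt{f(t)} \approx t + t^2 + t^3$, which checks out: $(t+t^2+t^3)^2 = t^2+2t^3+3t^4+2t^5+t^6$.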

I will leave the details to you.

This technique can be used to get $e^{f(t)}$, $f^{\alpha}(t)$ for real $\alpha$, and similar results.
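For instance (a sketch of the same idea, not part of the original answer): if $g(t)=e^{f(t)}$, then $g'(t)=f'(t)g(t)$, and comparing coefficients of $t^n$ gives

$$(n+1)b_{n+1}=\sum_{k=0}^{n}(k+1)a_{k+1}b_{n-k}, \qquad b_0=e^{a_0},$$

while for $g=f^{\alpha}$ the identity $f(t)g'(t)=\alpha f'(t)g(t)$ plays the same role.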