"Lagrange inversion" around points with $f'(x_0)=0$


In an older question to which I provided an answer, it was asked how to compute a particular limit involving the roots of a transcendental function around its extremum. This limit required evaluating several terms of a power series for each of the two roots the function possesses around the extremum, which is reminiscent of a procedure akin to the Lagrange inversion theorem. However, as explicitly stated in the assumptions of that theorem, the derivative of the function has to be non-zero for the theorem to apply (the function has to be locally invertible). That requirement, however, did not stop me from deriving a term-by-term expansion for both functional inverses around the extremum.

I was surprised because I couldn't find any references on the subject, and this calculation seems to be a low-hanging fruit: a simple generalization of a well-known theorem.

My question is twofold, but an answer to either component will suffice:

  1. Assuming that around a point $x_0$ we have $f'(x_0)=0$, $f''(x_0)\neq 0$, and given that $f(x)=\sum_n a_n(x-x_0)^n$, what are the coefficients $b^{\pm}_{n}$ of the series expansions of the two functional inverses $r^\pm(x)=\sum_{n=0}^{\infty}b_n^{\pm}(x-f(x_0))^{n/2}$ that satisfy $f(r^\pm (x))=x$? They can also be seen as the two solutions of the equation $f(x)=c$ in a neighborhood of $c=f(x_0)$. Is there a general formula for them (however formal)? Here I define $r^+(x)$ to be the root satisfying $r^+(x)>x_0$, and $r^-(x)$ the one satisfying $r^-(x)<x_0$.
  2. Has this structure been studied in the literature before? Does it come under a certain name? If not, what is it that makes this theorem difficult to establish or uninteresting?

My work: Experimenting with a couple of simple functions and their behavior around double zeroes indicates that these coefficients are related by $b_n^-=(-1)^n b_n^+$. Also, these series expansions are obviously one-sided: they are defined for $x\in [f(x_0),R)$ if $f''(x_0)>0$ and for $x\in (R, f(x_0)]$ if $f''(x_0)<0$, for some value of $R$ representing a radius of convergence. I also computed the first few coefficients for an arbitrary $f$ with a minimum at $x_0$:

$$b_0=x_0~~,~~ b_1=\sqrt{\frac{2}{f''(x_0)}}~~, ~~ b_2=-\frac{f'''(x_0)}{3(f''(x_0))^2}$$
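
As a sanity check on these coefficients, here is a minimal numeric sketch (the test function $f(x)=2x^2+x^3$ and the Newton solver are my own choices for illustration): it compares the positive root of $f(x)=c$, found numerically, against the truncated series $b_0+b_1\sqrt{c}+b_2\,c$.

```python
import math

# Illustrative test function (my choice): f(x) = 2x^2 + x^3, with a minimum
# at x0 = 0, so f(0) = 0, f''(0) = 4, f'''(0) = 6.
def f(x):
    return 2 * x**2 + x**3

b0 = 0.0                       # x0
b1 = math.sqrt(2 / 4)          # sqrt(2 / f''(x0))
b2 = -6 / (3 * 4**2)           # -f'''(x0) / (3 f''(x0)^2)

def root_plus(c, x=1.0):
    """Newton iteration for the root of f(x) = c with x > x0."""
    for _ in range(60):
        x -= (f(x) - c) / (4 * x + 3 * x**2)   # f'(x) = 4x + 3x^2
    return x

c = 1e-4
approx = b0 + b1 * math.sqrt(c) + b2 * c
# The two values should differ only by the next term, of order c^(3/2)
print(root_plus(c), approx)
```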

I also noticed a further generalization of the problem: when I demand that $f'(x_0)=f''(x_0)=\dots=f^{(n-1)}(x_0)=0$, $f^{(n)}(x_0)\neq 0$, there are now $n$ functional inverses, but most of them represent complex roots around the extremum ($n-1$ of them if $n$ is odd and $n-2$ if $n$ is even). This hint may be useful if one actually tries to formalize the theorem, since in the complex plane all branches will be included.
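
The branch count is easy to see on the model case $f(x)=x^n$ (my illustration, not from the original post): the $n$ inverses of $f(x)=c$ are $\omega^k c^{1/n}$ with $\omega$ a primitive $n$th root of unity, and for real $c>0$ only one of them is real when $n$ is odd, two when $n$ is even.

```python
import cmath

# Model case for the higher-order degeneracy: f(x) = x^n has
# f'(0) = ... = f^(n-1)(0) = 0, and f(x) = c has n inverse branches
# r_k(c) = w^k * c^(1/n), where w is a primitive n-th root of unity.
n = 3
c = 0.008
w = cmath.exp(2j * cmath.pi / n)
branches = [w**k * c ** (1 / n) for k in range(n)]

for r in branches:
    print(r, r**n)              # every branch satisfies r^n ≈ c

real = [r for r in branches if abs(r.imag) < 1e-12]
print(len(real))                # for odd n, only one branch is real
```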


There is such a thing as a Puiseux series. I once spent many hours learning how to calculate them, very painfully, by hand, and how to use them to solve difficult implicit equations. They can be used to expand curves about singular points. They apparently have uses in algebraic geometry, but I don't know anything about that. Importantly, the field of formal Puiseux series (over an algebraically closed field of characteristic zero, e.g. $\mathbb{C}$) is the algebraic closure of the field of formal Laurent series. If you want, I could try to spend some time digging up the references where I learned the proofs of this and the computational technique, but for now I'll just showcase an example.

Unlike Lagrange–Bürmann inversion, where a variety of surprising closed-form expressions for the coefficients is known, to the best of my knowledge there is no such closed-form expression for the coefficients of a Puiseux series. We must compute them by hand or by computer. I only know of one algorithm for doing so, called the 'Newton polygon' method (apparently Newton played around with these things back in the day). Formally, since the Puiseux series form the algebraic closure, any expression $F(x,y)$ which is a polynomial in $y$ with formal Laurent series as coefficients has all of its roots expressible as Puiseux series. This means, in particular, that we have another tool in our box for approximating implicit curves. Many inverse function problems can be cast as implicit function problems, so this is more or less what you are asking. Moreover, Puiseux series are exactly of the form you describe, with fractional exponents.

Example problem:

Locally expand $y^2-y-x^{-1}=0$ in a Puiseux series.

To start with, let's investigate why we can't always use the usual technique for expanding implicit curves:

Let's try at $x=-4,y=1/2$. Implicitly differentiating gives $2y\cdot y'-y'+x^{-2}=0$. To find $y'(-4)$, we substitute $x=-4,y=1/2$ to get: $$2\cdot\frac{1}{2}\cdot y'-y'+\frac{1}{16}=0\implies\frac{1}{16}=0$$Uh oh...

No local solution of the form $\sum_{n\ge0} a_n(x+4)^n$ can then exist. It's also clear that the hypotheses of the implicit function theorem do not hold in any case, and plotting the curve reveals that $x=-4$ is a cusp-like point. Note that this is the same question as:

Find a local inverse of $f(y)=\frac{1}{y^2-y}$ near $y=0.5$.

And $f'(0.5)=0$, so we are in the situation of your post. I must note that the Puiseux series method would also be successful in situations where higher derivatives vanish as well, $f'(a)=\dots=f^{(n)}(a)=0$. And yes, this is quite an easy function to invert via ordinary means, i.e. the quadratic formula, but I want to showcase a simple example. I have, for example, used this method to successfully find local approximations to $y^3-y\cdot\tan x+\sin x=0$ (see below), which is definitely not simple to invert, though I wouldn't be sure how to cast that as an inverse function problem. We can also use it to expand curves at "points at infinity"; e.g. this method can asymptotically expand $y^2-y-x^{-1}=0$ near $x=0$.
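
To tie this example back to the formulas in the question (a sketch, using values I computed myself): $f(y)=1/(y^2-y)$ has $f(1/2)=-4$, $f'(1/2)=0$ and $f''(1/2)=-32$, so $|b_1|=\sqrt{2/|f''|}=1/4$; since $f''<0$, the two real inverse branches exist for $x\le-4$ and behave to leading order like $1/2\pm\frac{1}{4}\sqrt{-(x+4)}$.

```python
import math

def f(y):
    return 1 / (y**2 - y)

# Leading-order inverse branches near the critical value f(1/2) = -4,
# using |b1| = sqrt(2/|f''(1/2)|) = 1/4 with f''(1/2) = -32:
def r_plus(x):
    return 0.5 + math.sqrt(-(x + 4)) / 4

def r_minus(x):
    return 0.5 - math.sqrt(-(x + 4)) / 4

x = -4 - 1e-6                    # a point just below the critical value
print(f(r_plus(x)), f(r_minus(x)), x)   # both values land very close to x
```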

That is what I will showcase. To expand at $x=-4$ instead, you'd just need to shift the curve so that the point of interest sits at the origin, i.e. expand in the shifted variable $x+4$. Note that the usual implicit function methods can't be used for an asymptotic expansion at $x=0$; at least, none that I know of can do that.

I begin by casting our curve as $x^{\alpha_2}y^2-x^{\alpha_1}y-x^{\alpha_0}$. We are forced to set $\alpha_2=0=\alpha_1$ and $\alpha_0=-1$. The indices correspond to the exponent of $y$. Now for the 'Newton polygon':

*(Image 1: the Newton polygon, obtained by plotting the points $(i,\alpha_i)=(0,-1),(1,0),(2,0)$; the shallowest blue edge runs from $(0,-1)$ to $(2,0)$.)*

The shallowest blue line has gradient $1/2$. We let $g$ equal the negative of the gradient of the shallowest blue line, so $g=-1/2$, and the blue frame is the so-called Newton polygon. I obtained it by plotting $\alpha_{0},\alpha_1,\alpha_2$ against $0,1,2$ and joining the points in a convex polygon. (I realise this is not very formal, but rigorous explanations for why this method works can be found, and I will dig them up upon request.)

Now - which points featured on the shallowest line? Only the $0$th order (constant, $y^0$) and $2$nd order ($y^2$) coefficients (recall this is a polynomial in $y$ with coefficients in Laurent series of $x$) played a part, so now we take the equation: $$(-1)c^0+(1)c^2=0$$where the coefficients $(-1),1$ are precisely the leading coefficients of the Laurent coefficients that played a part. What? Well, say we had $y^2-y+(-x^{-4}-x^3+x^5)y^0=0$; I'd only consider the coefficient of $x^{-4}$, which is $-1$. Here, the $y^0$ coefficient is $(-1)x^{-1}$, so I use $(-1)c^0$ ($c^0$ for the $y^0$ coefficient). Similarly, $y^2$ has coefficient $(1)x^0$, whose leading term is $x^0$ with coefficient $1$. This will become important later.
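
The slope-finding step can be sketched in a few lines (a simplification of the full method: I only take the shallowest slope out of the leftmost point rather than building the whole lower hull, which suffices for this example):

```python
from fractions import Fraction

# Points (i, alpha_i): exponent of y against the leading x-exponent of its
# Laurent coefficient, for the curve y^2 - y - x^{-1}.
points = [(0, Fraction(-1)), (1, Fraction(0)), (2, Fraction(0))]

i0, a0 = points[0]
slopes = [(a - a0) / (i - i0) for i, a in points[1:]]
gradient = min(slopes)   # shallowest slope: 1/2, along (0,-1) -> (2,0)
g = -gradient            # the exponent used in the substitution y -> c*x^g + y
print(gradient, g)       # 1/2 -1/2
```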

This equation has solutions $c=\pm 1$; I take $c=1$ (see the warning below about branches). What do we do now? I substitute $y\mapsto cx^g+y=x^{-1/2}+y$ in our equation, and resolve this to the new curve: $$y^2+(2x^{-1/2}-1)y-x^{-1/2}=0$$Now we rinse and repeat. The coefficient of $y^2$ is still of order $0$, so $\alpha_2=0$. $\alpha_1=-1/2$ because the coefficient of $y^1$ is $2x^{-1/2}-1$, with leading term of order $-1/2$; $\alpha_0=-1/2$ too. The Newton polygon:

*(Image 2: the new Newton polygon, through the points $(0,-1/2),(1,-1/2),(2,0)$; the shallowest edge, from $(0,-1/2)$ to $(1,-1/2)$, is horizontal.)*

The shallowest line has gradient zero, so I take $g=-0=0$. The contributing terms to this line were $\alpha_{0}$ and $\alpha_1$, and I now take the equation in $c$ with coefficients extracted as the leading terms of the corresponding coefficients in $y$... so the coefficient of $c^1$ is $2$, because $2$ is the leading coefficient of $2x^{-1/2}-1$, the coefficient of $y^1$: $$-c^0+2c=0,\,c=\frac{1}{2}$$Now I let $y\mapsto cx^g+y=\frac{1}{2}+y$ to obtain: $$y^2+2x^{-1/2}y-\frac{1}{4}=0$$I continue. So far, I have found: $$y\approx x^{-1/2}+\frac{1}{2}$$which is a decent expansion of $y$ near $x=0$. You can continue this indefinitely.
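
Since this particular curve is quadratic in $y$, we can cross-check the Newton polygon result against the quadratic formula; substituting $x=t^2$ turns the Puiseux series into an ordinary Laurent series in $t$ (a sympy sketch):

```python
import sympy as sp

t = sp.symbols('t', positive=True)

# Quadratic-formula root of y^2 - y - 1/x = 0, with x = t^2 (so t = x^(1/2)):
y_plus = (1 + sp.sqrt(1 + 4 / t**2)) / 2

ser = sp.series(y_plus, t, 0, 3)
# Terms: 1/t + 1/2 + t/8 + O(t^3), i.e. y = x^{-1/2} + 1/2 + x^{1/2}/8 + ...
print(ser)
```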

WARNING: In solving equations for $c$, it will sometimes be the case that multiple (complex) solutions exist. You must pick a branch of the root (say, cube root) and stick to it. Then you will produce one, of potentially many, solutions.
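
For instance, in the first step above, choosing the other solution $c=-1$ produces the second Puiseux branch; for this quadratic we can again verify that against the closed form (a sympy sketch):

```python
import sympy as sp

t = sp.symbols('t', positive=True)

# The other quadratic-formula root of y^2 - y - 1/x = 0, with x = t^2; this
# corresponds to the branch choice c = -1 in the first Newton polygon step.
y_minus = (1 - sp.sqrt(1 + 4 / t**2)) / 2
ser = sp.series(y_minus, t, 0, 3)
# Terms: -1/t + 1/2 - t/8 + O(t^3)
print(ser)
```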


$$y^3-y\cdot\tan x+\sin x=0$$locally near $x=0$ has one real solution (there are also two other complex branches), which is a Puiseux series beginning: $$y=-x^{1/3}-\frac{1}{3}x^{2/3}+\frac{1}{81}x^{4/3}-\frac{1}{243}x^{5/3}+\cdots$$
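
As a check on the quoted coefficients (a sympy sketch): substituting $x=t^3$ and the truncated series into the curve, the residual should vanish to the order controlled by the truncation.

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x = t**3   # so the Puiseux series in powers of x^(1/3) becomes a series in t

# Truncation of the quoted Puiseux solution:
y = -t - t**2/3 + t**4/81 - t**5/243

residual = y**3 - y * sp.tan(x) + sp.sin(x)
# All terms below t^9 cancel, confirming the coefficients: prints O(t**9)
print(sp.series(residual, t, 0, 9))
```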