Aitken's Delta-Squared Process transforms the sequence $s=(s_0,s_1,s_2,...,s_n,...)$ into the sequence $S=(S_0,S_1,S_2,...,S_n=s_{n+2}-(s_{n+2}-s_{n+1})^2/(s_{n+2}-2 s_{n+1}+s_n),...)$.
If $\Delta s=(\Delta s_0,\Delta s_1,\Delta s_2,...,\Delta s_n=s_{n+1}-s_n,...)$ and $\Delta^2 s=(\Delta^2 s_0,\Delta^2 s_1,\Delta^2 s_2,...,\Delta^2 s_n=\Delta s_{n+1}-\Delta s_n,...)$ then $S_n=s_{n+2}-(\Delta s_{n+1})^2/\Delta^2 s_n$. It reminds me of Newton–Raphson.
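As a quick numerical sanity check of this formula, here is a minimal sketch (the fixed-point iteration $x_{n+1}=\cos x_n$ and its limit, the Dottie number, are my own example, not part of the definition):

```python
import math

def aitken(s):
    """S_n = s_{n+2} - (Δs_{n+1})^2 / Δ²s_n, for n = 0 .. len(s)-3."""
    return [s[n + 2] - (s[n + 2] - s[n + 1]) ** 2
            / (s[n + 2] - 2 * s[n + 1] + s[n])
            for n in range(len(s) - 2)]

# Linearly convergent example: x_{n+1} = cos(x_n), whose limit is the
# Dottie number ≈ 0.7390851332151607.
s = [0.5]
for _ in range(10):
    s.append(math.cos(s[-1]))

limit = 0.7390851332151607
print(abs(s[-1] - limit))          # error of the original sequence
print(abs(aitken(s)[-1] - limit))  # error after one Aitken pass: much smaller
```

The transform can be iterated on its own output, at the cost of two terms per pass.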
Note that the sequence $S'_n=(S'_{n,0},S'_{n,1},S'_{n,2},...,S'_{n,k}=s_n+\Delta s_n\sum_{i=0}^{k-1}(\Delta s_{n+1}/\Delta s_n)^i,...)$ is $S'_n=(s_n,s_{n+1},s_{n+2},s_{n+2}+(\Delta s_{n+1})^2/\Delta s_n,...)$ and converges to $S'_{n,\infty}=S_n$ like a geometric series. This makes me think that Aitken's $\Delta^2$ process approximates the tail of a linearly convergent sequence by a geometric series, so the transformed sequence can have superlinear convergence.
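To check this geometric-series reading numerically, a small sketch (same $\cos$ fixed-point example, my own choice): the closed form of the geometric tail, $S'_{n,\infty}=s_n+\Delta s_n/(1-\Delta s_{n+1}/\Delta s_n)$, agrees with $S_n$, and the partial sums $S'_{n,k}$ approach it like $r^k$:

```python
import math

s = [0.5]                 # x_{k+1} = cos(x_k): linear convergence
for _ in range(6):
    s.append(math.cos(s[-1]))

n = 0
d0 = s[n + 1] - s[n]      # Δs_n
d1 = s[n + 2] - s[n + 1]  # Δs_{n+1}
r = d1 / d0               # ratio of the geometric model, |r| < 1 here
S_n = s[n + 2] - d1 ** 2 / (s[n + 2] - 2 * s[n + 1] + s[n])  # Aitken

S_geo = s[n] + d0 / (1 - r)   # geometric tail summed in closed form
print(abs(S_geo - S_n))       # identical up to rounding

errs = []
for k in (3, 6, 12):
    S_k = s[n] + d0 * sum(r ** i for i in range(k))  # = S'_{n,k}
    errs.append(abs(S_k - S_n))
print(errs)                   # shrinks roughly like |r|^k
```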
Is that right?
How did Alexander Aitken discover the $\Delta^2$ process?
And are there "delta-squared processes" for higher convergence orders than Aitken's?
Possibly with a short formula using $\Delta^\theta s_n=\Delta^{\theta-1} s_{n+1}-\Delta^{\theta-1} s_n$ for greater $\theta$?
I haven't found anything like that so far, but I've thought of a few things that might lead somewhere good.
Foundation
If a sequence $s=(s_0,s_1,s_2,...,s_n,...)$ converges to $s_\infty$ with order of convergence $\theta \ge 1$ and rate of convergence $0 < \mu < 1$, then $\lim_{n \to \infty}\frac {|s_{n+1}-s_\infty|}{|s_n-s_\infty|^\theta}=\mu$ $^{^{[Wiki/RateOfConvergence]}}$.
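For a concrete instance of this definition (my own example, not from the cited page): Newton's iteration for $\sqrt 2$ converges with order $\theta=2$ and rate $\mu=1/(2\sqrt 2)\approx 0.3536$, which the ratio $|s_{n+1}-s_\infty|/|s_n-s_\infty|^2$ confirms numerically:

```python
import math

s_inf = math.sqrt(2)          # known limit of the iteration below
x = 1.5
ratios = []
for _ in range(3):
    e = abs(x - s_inf)        # current error |s_n - s_inf|
    x = (x + 2 / x) / 2       # Newton step for x^2 = 2
    ratios.append(abs(x - s_inf) / e ** 2)

print(ratios)                 # tends to mu = 1/(2*sqrt(2)) ≈ 0.3536
```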
It seems that $(\forall n \in \mathbb N, s_n \in \mathbb R) \implies ( \theta \in \mathbb N^* \implies \lim_{n \to \infty}\frac {s_{n+1}-s_\infty}{(s_n-s_\infty)^\theta}=M)$ with $M=\pm\mu$. Presuming $s$ is such that $\forall n \in \mathbb N ,\ s_{n+1}-s_\infty = M\,(s_n-s_\infty)^\theta$, we can solve the resulting equations for the unknowns, including the value of $s_\infty$, with $\theta$ either set in advance or found along the way. This means that the closer $s$ is to the presumption ($\forall n \in \mathbb N ,\ s_{n+1}-s_\infty \approx M\,(s_n-s_\infty)^\theta$), the closer the computed $s_\infty$ is to the correct value (it would be good to estimate the deviation).
Specific order
Building a system of two equations, one for index $n$ and one for $n+1$, we have
$\left\{ \begin{aligned} s_{n+1}-s_\infty &= M\,(s_n-s_\infty)^\theta \\ s_{n+2}-s_\infty &= M\,(s_{n+1}-s_\infty)^\theta \end{aligned}\right. \implies$
$\left\{ \begin{aligned} M &= (s_{n+1}-s_\infty)/(s_n-s_\infty)^\theta \\ M &= (s_{n+2}-s_\infty)/(s_{n+1}-s_\infty)^\theta \end{aligned}\right. \implies$
$\left\{ \begin{aligned} \theta &= \log\frac{s_{n+2}-s_\infty}{s_{n+1}-s_\infty}\bigg/\log\frac{s_{n+1}-s_\infty}{s_n-s_\infty} \end{aligned}\right.,$
then for $\theta=1$ we can solve for $s_\infty$ and recover exactly Aitken's formula, $s_\infty=s_{n+2}-(\Delta s_{n+1})^2/\Delta^2 s_n$, and likewise other delta formulas for each $\theta$ value; but when we set a $\theta$ value for which $s_\infty$ cannot be isolated algebraically, we need some numerical method to find it.
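For instance, with $\theta$ fixed, eliminating $M$ from the two equations leaves $(s_{n+1}-s_\infty)^{\theta+1}=(s_{n+2}-s_\infty)(s_n-s_\infty)^\theta$, which a root-finder can solve for $s_\infty$. A minimal bisection sketch (the bracket and the Newton-for-$\sqrt 2$ test data are my own choices, assuming a decreasing sequence with $s_\infty < s_{n+2}$):

```python
import math

def extrapolate(s0, s1, s2, theta, lo, hi, iters=200):
    """Solve (s1-x)^(theta+1) = (s2-x)*(s0-x)^theta for x by bisection.

    lo/hi must bracket a sign change of f, with x < s2 < s1 < s0 so all
    the differences stay positive (real powers need positive bases)."""
    def f(x):
        return (s1 - x) ** (theta + 1) - (s2 - x) * (s0 - x) ** theta
    a, b, fa = lo, hi, f(lo)
    for _ in range(iters):
        m = (a + b) / 2
        if (f(m) > 0) == (fa > 0):
            a, fa = m, f(m)
        else:
            b = m
    return (a + b) / 2

# Order-2 test data: three terms of Newton's iteration for sqrt(2).
s = [1.5]
for _ in range(2):
    s.append((s[-1] + 2 / s[-1]) / 2)

x = extrapolate(s[0], s[1], s[2], theta=2, lo=1.4, hi=s[2] - 1e-12)
print(abs(s[2] - math.sqrt(2)))   # error of the last known term
print(abs(x - math.sqrt(2)))      # error of the extrapolated limit
```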
Unknown order
Inserting one more equation, for index $n+2$, into the system, we have
$\left\{ \begin{aligned} s_{n+1}-s_\infty &= M\,(s_n-s_\infty)^\theta \\ s_{n+2}-s_\infty &= M\,(s_{n+1}-s_\infty)^\theta \\ s_{n+3}-s_\infty &= M\,(s_{n+2}-s_\infty)^\theta \end{aligned}\right. \implies$
$\left\{ \begin{aligned} M &= (s_{n+1}-s_\infty)/(s_n-s_\infty)^\theta \\ M &= (s_{n+2}-s_\infty)/(s_{n+1}-s_\infty)^\theta \\ M &= (s_{n+3}-s_\infty)/(s_{n+2}-s_\infty)^\theta \end{aligned}\right. \implies$
$\left\{ \begin{aligned} \theta &= \log\frac{s_{n+2}-s_\infty}{s_{n+1}-s_\infty}\bigg/\log\frac{s_{n+1}-s_\infty}{s_n-s_\infty}\\ \theta &= \log\frac{s_{n+3}-s_\infty}{s_{n+2}-s_\infty}\bigg/\log\frac{s_{n+2}-s_\infty}{s_{n+1}-s_\infty} \end{aligned}\right.,$
so using $s_n$, $s_{n+1}$, $s_{n+2}$ and $s_{n+3}$ we can find $s_\infty$ (and afterwards $\theta$ and $M$) such that
$\log\frac{s_{n+2}-s_\infty}{s_{n+1}-s_\infty}\bigg/\log\frac{s_{n+1}-s_\infty}{s_n-s_\infty}=\log\frac{s_{n+3}-s_\infty}{s_{n+2}-s_\infty}\bigg/\log\frac{s_{n+2}-s_\infty}{s_{n+1}-s_\infty}$,
But how can we do that? Only with a numerical method? So far I think the answer is "yes"...
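As a sketch that a numerical method can work here, bisection on the difference of the two $\theta$ estimates (the bracket and the order-2 Newton test data are my own choices, assuming a decreasing sequence with $s_\infty < s_{n+3}$):

```python
import math

def thetas(s0, s1, s2, s3, x):
    """The two order estimates from the system above, at a candidate limit x < s3."""
    r1 = (s1 - x) / (s0 - x)
    r2 = (s2 - x) / (s1 - x)
    r3 = (s3 - x) / (s2 - x)
    return math.log(r2) / math.log(r1), math.log(r3) / math.log(r2)

def find_limit(s0, s1, s2, s3, lo, hi, iters=200):
    """Bisection on g(x) = theta_1(x) - theta_2(x); lo/hi must bracket a sign change."""
    def g(x):
        t1, t2 = thetas(s0, s1, s2, s3, x)
        return t1 - t2
    a, b, ga = lo, hi, g(lo)
    for _ in range(iters):
        m = (a + b) / 2
        if (g(m) > 0) == (ga > 0):
            a, ga = m, g(m)
        else:
            b = m
    return (a + b) / 2

# Four terms of Newton's iteration for sqrt(2) (order 2, limit sqrt(2)).
s = [1.5]
for _ in range(3):
    s.append((s[-1] + 2 / s[-1]) / 2)

x = find_limit(*s, lo=1.41, hi=s[3] - 1e-13)
t1, t2 = thetas(*s, x)
print(abs(x - math.sqrt(2)))  # recovered limit, very close to sqrt(2)
print(t1, t2)                 # both order estimates near theta = 2
```

The bracket matters: the crossing sits extremely close to $s_{n+3}$ when convergence is fast, so the upper end must be pushed almost up to $s_{n+3}$ itself.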