For simplicity, let's assume each $f_i: \mathbb{R} \to \mathbb{R}$ is convex and twice differentiable, and define
$$ g(x) := \sum_{i=1}^{n}f_i(x) $$
Suppose we want to compute
$$ x^* := \arg\min_{x \in \mathbb{R}} g(x) $$
The Newton-Raphson iterations take the following form:
$$ x_{t+1} = x_t - \left(\sum_{i=1}^{n}f''_i(x_t)\right)^{-1}\left(\sum_{i=1}^{n}f'_i(x_t)\right) $$
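To make this concrete, here is a minimal Newton-Raphson sketch. The family $f_i(x) = e^{b_i x} - c_i x$ with $b_i, c_i > 0$ (and the particular values of `b` and `c`) is a toy example of my own choosing, not part of the question; it's just convex with simple derivatives.

```python
import numpy as np

# Hypothetical convex toy family: f_i(x) = exp(b_i*x) - c_i*x with b_i, c_i > 0,
# so f_i'(x) = b_i*exp(b_i*x) - c_i and f_i''(x) = b_i^2*exp(b_i*x) > 0.
b = np.array([1.0, 2.0, 0.5])
c = np.array([2.0, 1.0, 3.0])

def g_prime(x):
    """g'(x) = sum_i f_i'(x)."""
    return np.sum(b * np.exp(b * x) - c)

def g_double_prime(x):
    """g''(x) = sum_i f_i''(x)."""
    return np.sum(b**2 * np.exp(b * x))

# Newton-Raphson: x_{t+1} = x_t - g'(x_t) / g''(x_t)
x = 0.0
for _ in range(25):
    x -= g_prime(x) / g_double_prime(x)
```

At convergence `g_prime(x)` is (numerically) zero, which is the first-order condition for the minimizer of $g$.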
I was playing around and happened on the following idea. Suppose we solve
\begin{align*}
f_i'(x_i^*) = 0 \implies x_i^* = [f_i']^{-1}(0),
\end{align*}
i.e., each $x_i^*$ is the minimizer of its own summand (assuming each $f_i'$ has a root, which convexity alone does not guarantee). Then perform a second-order Taylor expansion of $f_i$ around $x_i^*$:
\begin{align*}
f_i(x) &\approx f_i(x_i^*) + f'_i(x_i^*)(x - x_i^*) + \frac{1}{2}f''_i(x_i^*)(x - x_i^*)^2 \\
&= f_i(x_i^*) + \frac{1}{2}f''_i(x_i^*)(x - x_i^*)^2,
\end{align*}
where the first-order term vanishes because $f_i'(x_i^*) = 0$. Summing over $i$,
\begin{align*}
g(x) \approx \sum_{i=1}^{n}\left[f_i(x_i^*) + \frac{1}{2}f''_i(x_i^*)(x - x_i^*)^2\right],
\end{align*}
and this quadratic approximation is minimized at
$$ \widetilde{x} = \frac{\displaystyle\sum_{i=1}^{n}f''_i(x_i^*)x_i^*}{\displaystyle\sum_{i=1}^{n}f''_i(x_i^*)} $$
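For concreteness, here is that one-shot estimate on the same made-up toy family $f_i(x) = e^{b_i x} - c_i x$ (my own example, chosen so each $f_i'$ has a closed-form root):

```python
import numpy as np

# Hypothetical toy family: f_i(x) = exp(b_i*x) - c_i*x with b_i, c_i > 0.
# Solving f_i'(x) = b_i*exp(b_i*x) - c_i = 0 gives x_i^* = log(c_i/b_i) / b_i.
b = np.array([1.0, 2.0, 0.5])
c = np.array([2.0, 1.0, 3.0])

x_star = np.log(c / b) / b          # per-summand minimizers x_i^*
curv = b**2 * np.exp(b * x_star)    # curvatures f_i''(x_i^*)

# x_tilde: curvature-weighted average of the per-summand minimizers
x_tilde = np.sum(curv * x_star) / np.sum(curv)
```

Note that $\widetilde{x}$ is a convex combination of the $x_i^*$, so it always lands between $\min_i x_i^*$ and $\max_i x_i^*$.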
From this, I had the idea to define $x_0 = \widetilde{x}$ and to continue iterating in this manner:
$$ x_{t+1} = \dfrac{\displaystyle\sum_{i=1}^{n}f''_i(x_t)x_i^*}{\displaystyle\sum_{i=1}^{n}f''_i(x_t)} $$
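A minimal sketch of this fixed-point iteration, again on the toy family $f_i(x) = e^{b_i x} - c_i x$ of my own choosing (the roots $x_i^*$ are computed once and held fixed; only the curvature weights are re-evaluated at each iterate):

```python
import numpy as np

# Hypothetical toy family: f_i(x) = exp(b_i*x) - c_i*x with b_i, c_i > 0.
b = np.array([1.0, 2.0, 0.5])
c = np.array([2.0, 1.0, 3.0])
x_star = np.log(c / b) / b   # roots of f_i', fixed throughout

def curvatures(x):
    """f_i''(x) for every summand, evaluated at the current iterate."""
    return b**2 * np.exp(b * x)

# x_0 = x_tilde, then x_{t+1} = sum_i f_i''(x_t) x_i^* / sum_i f_i''(x_t)
x = np.sum(curvatures(x_star) * x_star) / np.sum(curvatures(x_star))
for _ in range(100):
    w = curvatures(x)
    x = np.sum(w * x_star) / np.sum(w)
```

Each iterate is a weighted average of the fixed $x_i^*$, so the sequence is confined to $[\min_i x_i^*, \max_i x_i^*]$; whether it converges, and how its limit relates to the Newton-Raphson answer, is exactly what I'm asking about.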
How does this procedure compare against Newton-Raphson?