Let's say we have the following fixed-point problem: find $y\in\mathbb{R}^n$ that satisfies $$ y = p + f(y) $$ where $p\in\mathbb{R}^n$ is a constant and $f:\mathbb{R}^n\rightarrow\mathbb{R}^n$. In this case, $f(\cdot)$ has a special property: its second derivative is constant, i.e. $\partial^3 f_i / \partial y_j \partial y_k \partial y_l = 0$ (the subscript indicates the index of the element, e.g. $f_4$ is the 4-th output of $f(\cdot)$), so each $f_i$ is at most quadratic in $y$. We can write code to compute various quantities, such as Jacobian-vector products and (bilinear) Hessian-vector-vector products, but we cannot store the Jacobian matrix or the 3D Hessian tensor explicitly.
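To make the setup concrete, here is a minimal sketch of the matrix-free products using JAX forward-mode autodiff. The quadratic map and the arrays `A`, `B` are made-up illustrative data, not part of the question; only the small size lets us materialize `B` here at all.

```python
import jax
import jax.numpy as jnp

# Illustrative quadratic map (made-up data):
# f_i(y) = sum_j A[i,j] y[j] + 0.5 * sum_{j,k} B[i,j,k] y[j] y[k],
# so the second derivative of f is constant and the third derivative is zero.
n = 4
A = 0.1 * jax.random.normal(jax.random.PRNGKey(0), (n, n))
B = 0.05 * jax.random.normal(jax.random.PRNGKey(1), (n, n, n))

def f(y):
    return A @ y + 0.5 * jnp.einsum('ijk,j,k->i', B, y, y)

def jvp(y, v):
    # Jacobian-vector product (df/dy)(y) @ v, without forming the Jacobian
    return jax.jvp(f, (y,), (v,))[1]

def hvvp(y, u, v):
    # Hessian-vector-vector product sum_{j,k} H[i,j,k] u[j] v[k],
    # computed forward-over-forward; constant in y for a quadratic f
    return jax.jvp(lambda z: jvp(z, u), (y,), (v,))[1]
```

Because `f` is quadratic, `hvvp` returns the same vector at every evaluation point, which is exactly the structure the question asks to exploit.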
Is there an algorithm that's designed specifically for this type of problem?
I know some algorithms, such as Broyden's method and Anderson acceleration, but they are designed for general fixed-point or root-finding problems and rely only on a first-order approximation. The problem is that their parameters need to be tuned to make them work for each specific case.
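For reference, a general-purpose matrix-free solver of the kind mentioned above can be sketched with SciPy's Anderson mixing on the root form $g(y) = p + f(y) - y = 0$. The quadratic map and the data `A`, `p` below are made-up for illustration; the default mixing parameters (`M`, `w0`, etc.) are exactly the knobs that may need per-problem tuning.

```python
import numpy as np
from scipy.optimize import anderson

# Made-up quadratic example; A and p are illustrative, not from the question.
n = 5
rng = np.random.default_rng(0)
A = 0.3 * rng.standard_normal((n, n))  # contractive linear part
p = rng.standard_normal(n)

def f(y):
    # quadratic map: its second derivative is constant
    return A @ y + 0.01 * y**2

def residual(y):
    # root form of the fixed-point equation y = p + f(y)
    return p + f(y) - y

# Anderson acceleration with default mixing parameters
y_star = anderson(residual, np.zeros(n), f_tol=1e-9)
```

This uses only residual evaluations (first-order information), so it ignores the constant-second-derivative structure entirely, which is what motivates the question.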