I have come across an optimization problem with the following objective function:
$$f(x_0,y_0,z_0,x_1,y_1,z_1,...,x_N,y_N,z_N) = \sum_{i=0}^N f_i(x_i,y_i,z_i, \alpha(x_{i+1}-x_i) + \beta(y_{i+1}-y_i))$$
i.e. the objective function is a sum of functions $f_i$, each of which depends only on three variables $x_i, y_i, z_i$ and on a linear combination of the differences to the direct neighbors (in $x_i$ and $y_i$). So far I have tried an NLCG (nonlinear conjugate gradient) algorithm, but convergence is very slow.
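For concreteness, here is a minimal sketch of the structure in code. The per-term function `f_i` is purely hypothetical (a convex quadratic plus a $\cos$ coupling term), and since the formula leaves the boundary term at $i = N$ without a right neighbor, the sketch simply sets $s_N = 0$:

```python
import numpy as np

alpha, beta = 1.0, 0.5
N = 10  # there are N + 1 triples (x_i, y_i, z_i)

def f_i(i, x, y, z, s):
    # hypothetical per-term function, for illustration only
    return x**2 + y**2 + z**2 + np.cos(s)

def objective(v):
    # v is the flat vector [x_0, y_0, z_0, x_1, y_1, z_1, ...]
    v = v.reshape(N + 1, 3)
    x, y, z = v[:, 0], v[:, 1], v[:, 2]
    # coupling term s_i = alpha*(x_{i+1} - x_i) + beta*(y_{i+1} - y_i)
    s = alpha * np.diff(x) + beta * np.diff(y)
    # the last index has no right neighbor; s_N = 0 is an assumption here
    s = np.append(s, 0.0)
    return sum(f_i(i, x[i], y[i], z[i], s[i]) for i in range(N + 1))
```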
Is there a specialized solver that can exploit the structure of the optimization problem?
I have thought about introducing new variables $s_i = \alpha(x_{i+1}-x_i) + \beta(y_{i+1}-y_i)$ and writing $$f(x_0,y_0,z_0,s_0,...,x_N,y_N,z_N,s_N) = \sum_{i=0}^N f_i(x_i,y_i,z_i,s_i)$$ because then the optimization could be carried out separately for each $f_i$. However, I would then have to introduce equality constraints to recover the original problem.
$f_i(x_i,y_i,z_i,s_i)$ is nonconvex (there is a $\cos(s_i)$ term in the function); however, as a function of $(x_i,y_i,z_i)$ alone, with $s_i$ fixed, $f_i$ is convex.
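To make the splitting idea concrete, here is a sketch of the reformulated problem on a toy instance, using the same hypothetical $f_i$ (convex quadratic plus $\cos(s_i)$) and SciPy's SLSQP to handle the equality constraints that tie $s_i$ back to the neighbor differences:

```python
import numpy as np
from scipy.optimize import minimize

alpha, beta, N = 1.0, 0.5, 5

def split_objective(v):
    # v = [x_0, y_0, z_0, s_0, ..., x_N, y_N, z_N, s_N], one block per i
    v = v.reshape(N + 1, 4)
    x, y, z, s = v.T
    # separable sum over i: toy f_i, convex in (x, y, z), nonconvex in s
    return np.sum(x**2 + y**2 + z**2 + np.cos(s))

def coupling(v):
    # equality constraints s_i - alpha*(x_{i+1}-x_i) - beta*(y_{i+1}-y_i) = 0,
    # imposed for i = 0..N-1 (only where a right neighbor exists)
    v = v.reshape(N + 1, 4)
    x, y, z, s = v.T
    return s[:-1] - alpha * np.diff(x) - beta * np.diff(y)

v0 = np.full(4 * (N + 1), 0.1)
res = minimize(split_objective, v0, method="SLSQP",
               constraints={"type": "eq", "fun": coupling})
```

Note that the terms are only truly separable once the equality constraints are relaxed or dualized; as written, the constraints re-couple neighboring blocks.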