Optimizing over an infinite set of variables


This may be a very basic question, but it's been a while since I did any optimization.

Suppose I have a sequence $(x_i)$, $i=1,2,\ldots$ in the $\ell^2$ space and the following optimization problem:

$$\underset{\{x_i\}}{\operatorname{minimize}}\sum_{i=1}^\infty f(x_i)\\ \text{subject to}~\sum_{i=1}^\infty g(x_i)\leq C$$

The functions $f$ and $g$ are differentiable and convex. Can I still use the method of Lagrange multipliers here? If so, are there any additional steps that I need to take, such as verifying additional properties of $f$ and $g$?
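For reference, in a finite-dimensional truncation to $n$ variables, the KKT conditions for this problem would read (stationarity, primal feasibility, dual feasibility, complementary slackness):

$$f'(x_i)+\lambda g'(x_i)=0\quad(i=1,\ldots,n),\qquad \sum_{i=1}^n g(x_i)\leq C,\qquad \lambda\geq 0,\qquad \lambda\left(\sum_{i=1}^n g(x_i)-C\right)=0.$$

What I am unsure about is whether these conditions carry over unchanged when $n\to\infty$.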

When I perform the standard steps, differentiating the Lagrangian functional $\Lambda=\sum_{i=1}^\infty f(x_i)+\lambda\left(\sum_{i=1}^\infty g(x_i)-C\right)$ with respect to each $x_i$ and solving $\frac{\partial\Lambda}{\partial x_i}=0$, I can show that the resulting solution $(x_i^*)$ yields a convergent objective, $\sum_{i=1}^\infty f(x_i^*)>-\infty$, and that there exists $\lambda\geq0$ for which the constraint is satisfied. Is there anything more I need to do?
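To make the steps above concrete, here is a minimal numerical sketch on a truncated version of the problem. The specific choices $f(x)=(x-1)^2$, $g(x)=x^2$, the truncation level $N$, and the budget $C$ are all my own illustrative assumptions, not part of the problem; any differentiable convex pair would do. Stationarity is solved in closed form per coordinate, and the multiplier $\lambda$ is found by bisection so that the constraint holds:

```python
# Hypothetical example: f(x) = (x - 1)^2, g(x) = x^2 (my assumptions,
# not from the original problem). Truncate the sequence to N terms.
N = 10      # truncation level of the infinite sequence
C = 2.5     # constraint budget

def x_of_lambda(lam):
    # Per-coordinate stationarity of the Lagrangian:
    # f'(x) + lam * g'(x) = 2(x - 1) + 2*lam*x = 0  =>  x = 1 / (1 + lam).
    # By symmetry every coordinate gets the same value.
    return 1.0 / (1.0 + lam)

def constraint_slack(lam):
    # C - sum_i g(x_i(lam)), with all N coordinates equal
    return C - N * x_of_lambda(lam) ** 2

# If the unconstrained minimizer (lam = 0) is already feasible, take lam = 0;
# otherwise bisect for the lam >= 0 that makes the constraint tight
# (complementary slackness).
if constraint_slack(0.0) >= 0.0:
    lam = 0.0
else:
    lo, hi = 0.0, 1.0
    while constraint_slack(hi) < 0.0:
        hi *= 2.0                      # grow until the constraint is slack
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if constraint_slack(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)

x_star = x_of_lambda(lam)
print(lam, x_star)  # closed form here: lam = 1, x* = sqrt(C/N) = 0.5
```

For these particular choices the constraint is active (the unconstrained minimizer $x_i=1$ violates it), and the bisection recovers the closed-form answer $x_i^*=\sqrt{C/N}$ with $\lambda=\sqrt{N/C}-1\geq 0$, consistent with the KKT reasoning above.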