We know how projected gradient descent works for bound-constrained optimization (https://neos-guide.org/content/gradient-projection-methods). It is basically steepest descent with the additional requirement that the iterate be projected back onto the feasible region at each step. I have a nonlinear objective function subject to bound constraints on each variable.
Can we tailor conjugate gradient or BFGS in the same way, using projections to solve the problem? That is, instead of a steepest-descent step, each iteration takes a CG or BFGS step, and if the new point leaves the feasible region, we project it back using a projection like the one suggested in the referenced link. I know L-BFGS-B exists, but I can't find any general theory for CG-based approaches. Is there a generalized theory, or are there convergence results, for gradient/Hessian-based methods combined with projection for bound-constrained optimization?
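To make the question concrete, here is a minimal sketch (Python/NumPy) of the naive scheme I mean, using a Fletcher-Reeves CG direction with a fixed step size. The function name `projected_cg`, the fixed step, and the stalling test are my own placeholders for illustration; I am not claiming this particular recipe converges in general, which is exactly what I am asking about.

```python
import numpy as np

def project(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def projected_cg(grad, x0, lo, hi, step=1e-2, max_iter=1000, tol=1e-10):
    """Naive 'CG step + projection' scheme: take a Fletcher-Reeves CG
    step, then project the trial point back onto the box.  This only
    illustrates the idea; no convergence guarantee is implied."""
    x = project(np.asarray(x0, dtype=float), lo, hi)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        x_new = project(x + step * d, lo, hi)
        if np.linalg.norm(x_new - x) < tol:  # iterate stalled on the box
            return x_new
        g_new = grad(x_new)
        # Fletcher-Reeves update for the CG direction
        beta = (g_new @ g_new) / max(g @ g, 1e-16)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy example: minimize (x0-2)^2 + (x1+1)^2 over the box [0,1]^2;
# the unconstrained minimizer (2,-1) projects to the corner (1,0).
g = lambda x: 2.0 * (x - np.array([2.0, -1.0]))
print(projected_cg(g, np.array([0.5, 0.5]), 0.0, 1.0))
```

On this convex toy problem the scheme happens to land on the correct corner, but my worry is precisely that the CG direction after projection need not be a descent direction in general, unlike the projected-gradient direction analyzed in the link above.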