Convexity of linear least squares when a derivative penalty on the coefficients is added to the objective


My math background is quasi non-existent, so please bear with me.

Context: I am implementing a method for spectral unmixing called MCR-ALS (Multivariate Curve Resolution - Alternating Least Squares) for a specific chemical use case. Essentially, I am recording spectroscopic data over time of a chemical reaction involving multiple components with different spectral fingerprints. Every experimental spectrum is a linear combination of the pure-component spectra, weighted by the corresponding concentrations at that point in time. Neither the spectral fingerprints nor the concentration time series are known. MCR-ALS is commonly applied in such cases and works like so:

  1. Start with an initial guess for spectra and time profiles.
  2. Iterate until convergence:
    • Set spectra as fixed and (linearly) fit the time profiles to the experimental data --> update time profiles.
    • Set time profiles as fixed and (linearly) fit spectra --> update spectra.
  3. Retrieve final result.
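The loop above can be sketched in a few lines of NumPy. This is a minimal illustration of plain (unconstrained) alternating least squares, not a full MCR-ALS implementation; the function name, shapes, and stopping rule are my own assumptions.

```python
import numpy as np

def mcr_als(D, S0, n_iter=100, tol=1e-10):
    """Sketch of plain alternating least squares for D ~ C @ S.

    D  : (n_times, n_channels) experimental data matrix
    S0 : (n_components, n_channels) initial guess for the pure spectra
    Returns fitted time profiles C and spectra S.
    """
    S = S0.copy()
    prev_resid = np.inf
    for _ in range(n_iter):
        # Spectra fixed: fit time profiles C (D ~ C S  <=>  D.T ~ S.T C.T)
        Ct, *_ = np.linalg.lstsq(S.T, D.T, rcond=None)
        C = Ct.T
        # Time profiles fixed: fit spectra S
        S, *_ = np.linalg.lstsq(C, D, rcond=None)
        # Stop when the residual no longer improves
        resid = np.linalg.norm(D - C @ S)
        if abs(prev_resid - resid) < tol:
            break
        prev_resid = resid
    return C, S
```

In a real MCR-ALS run each sub-fit would additionally enforce constraints such as non-negativity of concentrations and spectra, but each step is still a linear least-squares problem in the variable being updated.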

My aim: I know that the concentrations are going to change rather slowly. To avoid jittering in the concentration profiles I would like to add the sum of squared first-order differences of each profile (simply ((c_i[1:] - c_i[:-1])**2).sum() in NumPy terms) to the objective to be minimized.
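To illustrate what this penalized sub-problem looks like: since the finite-difference penalty is quadratic in the profiles, the whole objective can be rewritten as one (larger) linear least-squares problem by stacking penalty rows under the data rows. The sketch below does this for the profile-update step with spectra held fixed; the function name and the Kronecker-product formulation are my own, and the dense matrices are for illustration only (in practice one would use sparse operators).

```python
import numpy as np

def fit_profiles_smoothed(D, S, lam):
    """Fit time profiles C minimizing
        ||D - C @ S||_F**2 + lam * ||np.diff(C, axis=0)||_F**2.
    The penalty is quadratic, so this is still a single linear
    least-squares problem in vec(C).
    """
    n_t, _ = D.shape
    k = S.shape[0]
    # First-difference operator: (T @ c)[t] = c[t+1] - c[t]
    T = np.diff(np.eye(n_t), axis=0)
    # Column-major vec identities:
    #   vec(C @ S) = kron(S.T, I) @ vec(C)
    #   vec(T @ C) = kron(I, T) @ vec(C)
    A = np.kron(S.T, np.eye(n_t))             # data-term design matrix
    B = np.sqrt(lam) * np.kron(np.eye(k), T)  # stacked penalty rows
    M = np.vstack([A, B])
    rhs = np.concatenate([D.flatten(order="F"), np.zeros(B.shape[0])])
    c, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return c.reshape((n_t, k), order="F")
```

With lam = 0 this reduces to the ordinary least-squares profile update; increasing lam trades fit quality for smoother profiles.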

Now the question: Can I assume that the objective is still convex and use appropriate solvers?