Which Constraints Should I Compare with First in Quadratic Optimization Problem?


I am trying to minimize this objective function:

$$J(x) = \frac{1}{2}x^THx + c^Tx$$

First I thought I could use Newton's method, but later I found gradient descent, which is more suitable for this type of problem.

First I compute the gradient of $J(x)$:

$$\nabla J(x) = Hx + c$$

I want to find the $x$ where $\nabla J(x) = 0$, and I can approach it with gradient descent:

$$x_{n+1} = x_n - \gamma \nabla J(x_n)$$
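For a concrete check, this iteration can be sketched in NumPy (the values of $A$ and $b$ match the Octave listing below; note $c = -A^Tb$ is chosen so that minimizing $J$ reproduces the solution of $Ax = b$):

```python
import numpy as np

# Least-squares form of Ax = b: J(x) = 0.5 x'Hx + c'x with H = A'A, c = -A'b,
# so grad J(x) = Hx + c vanishes exactly where Ax = b (A is invertible here).
A = np.array([[3.0, 2.0], [-6.0, 5.0]])
b = np.array([4.0, 6.0])
H = A.T @ A
c = -A.T @ b

x = np.array([1.0, 1.0])   # initial state
gamma = 0.001              # learning rate
for _ in range(20000):
    x = x - gamma * (H @ x + c)   # gradient step

print(x)  # converges to linsolve(A, b) = [8/27, 42/27]
```

With this sign convention there is no need to negate $x$ at the end; the iterate converges directly to the solution of $Ax = b$.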

In this example, I set the constraint limits $LB$ (lower bound) and $UB$ (upper bound) on $x$.

% We want to solve Ax=b with constraints (bounds)
A = [3 2; -6 5];
b = [4; 6];
x = linsolve(A, b) % Unconstrained

% Create a quadratic formula: J(x) = 0.5*x'*H*x + c'*x
% With c = -A'*b, the minimizer of J solves Ax = b directly
H = A'*A;
c = -A'*b;
% Create the gradient
dJdx = @(x) H*x + c;

LB = [0.1; 0.5]; % Lower bounds for x
UB = [1.0; 1.4]; % Upper bounds for x
y = 0.001;       % Learning rate
x = [1; 1];      % Initial state
for i = 1:3000
  % Gradient step
  x = x - y*dJdx(x);

  % Project onto the box constraints
  for j = 1:length(x)
    if(x(j) > UB(j))
        x(j) = UB(j);
    elseif(x(j) < LB(j))
        x(j) = LB(j);
    endif
  endfor

endfor
x % Constrained solution

But what if I want to have the constraints

$$LB \le x \le UB$$ $$LBA \le Ax \le UBA$$

If I first enforce the $x$ constraints, the $Ax$ constraints might be violated, and vice versa. Or is there a way to determine up front whether these constraints can work with each other?
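One way to decide compatibility up front, assuming SciPy is available, is a feasibility LP: minimize a zero objective over both constraint sets and inspect the solver status. A sketch with the values from the Octave listing:

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[3.0, 2.0], [-6.0, 5.0]])
LB, UB = [0.1, 0.5], [1.0, 1.4]                # bounds on x
LBA = np.array([1.0, 1.0])                      # lower bounds on A x
UBA = np.array([4.0, 5.0])                      # upper bounds on A x

# Encode LBA <= Ax <= UBA as A_ub x <= b_ub by stacking [A; -A]
A_ub = np.vstack([A, -A])
b_ub = np.concatenate([UBA, -LBA])

res = linprog(c=np.zeros(2), A_ub=A_ub, b_ub=b_ub,
              bounds=list(zip(LB, UB)))
print("feasible" if res.status == 0 else "infeasible")
```

Here status 0 means a point satisfying all constraints exists (for these particular values, e.g. $x = (0.1, 0.5)$ gives $Ax = (1.3, 1.9)$, which is inside both bound sets), while status 2 means the constraints are mutually incompatible.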

For example, extending the Octave code:

% We want to solve Ax=b with constraints (bounds)
A = [3 2; -6 5];
b = [4; 6];
x = linsolve(A, b) % Unconstrained

% Create a quadratic formula: J(x) = 0.5*x'*H*x + c'*x
% With c = -A'*b, the minimizer of J solves Ax = b directly
H = A'*A;
c = -A'*b;
% Create the gradient
dJdx = @(x) H*x + c;

LB = [0.1; 0.5]; % Lower bounds for x
UB = [1.0; 1.4]; % Upper bounds for x
LBA = [1; 1]; % Lower bounds for Ax
UBA = [4; 5]; % Upper bounds for Ax

y = 0.001; % Learning rate
x = [1; 1]; % Initial state
for i = 1:3000
  % Gradient step
  x = x - y*dJdx(x);

  % Constraints
  for j = 1:length(x)

    % For upper bound
    if((A*x)(j) > UBA(j) && x(j) > UB(j))
        % What to write here?
    elseif((A*x)(j) < UBA(j) && x(j) < UB(j))
        % Do nothing here - Everything is normal
    elseif((A*x)(j) < UBA(j) && x(j) > UB(j))
        x(j) = UB(j);
    elseif((A*x)(j) > UBA(j) && x(j) < UB(j))
        % What to write here?
    endif

    % For lower bound
    if((A*x)(j) > LBA(j) && x(j) > LB(j))
        % Do nothing here - Everything is normal
    elseif((A*x)(j) < LBA(j) && x(j) < LB(j))
        % What to write here?
    elseif((A*x)(j) < LBA(j) && x(j) > LB(j))
        % What to write here?
    elseif((A*x)(j) > LBA(j) && x(j) < LB(j))
        x(j) = LB(j);
    endif

  endfor

endfor
x % Constrained solution
Best Answer

Denote the following

$$C_1 = \{x : LB \le x \le UB\}\,,$$ $$C_2 = \{x : LBA \le Ax \le UBA\}\,.$$

Note that $C_1$ and $C_2$ are closed convex sets. So, your problem becomes the following

$$\min_{x \in C_1\cap C_2} J(x)\,,$$

where $C_1\cap C_2$ is a closed convex set. The problem can be equivalently written as

$$\min_{x \in \mathbb{R}^N} J(x) + I_{C_1\cap C_2}(x)\,,$$

where $I_{C}$ denotes the indicator function of a set $C$ (zero on $C$, $+\infty$ outside).

Now, you may use the Projected Gradient Descent, which is a special case of Forward-Backward Splitting.

The update step essentially involves

$$x_{n+1} = P_{C_1\cap C_2}(x_n - \gamma_n \nabla J(x_n))\,.$$
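When only the box constraint $C_1$ is present, the projection has the simple closed form $P_{C_1}(x) = \min(\max(x, LB), UB)$, and the update above can be sketched in NumPy (using $H$, $c$ as in the question, with $c = -A^Tb$ so the unconstrained minimizer solves $Ax = b$; the bound values are those from the Octave listing):

```python
import numpy as np

A = np.array([[3.0, 2.0], [-6.0, 5.0]])
b = np.array([4.0, 6.0])
H, c = A.T @ A, -A.T @ b                  # J(x) = 0.5 x'Hx + c'x
LB = np.array([0.1, 0.5])
UB = np.array([1.0, 1.4])

def proj_box(v):
    # Euclidean projection onto {LB <= v <= UB}: componentwise clamp
    return np.minimum(np.maximum(v, LB), UB)

x = np.array([1.0, 1.0])
gamma = 0.001
for _ in range(20000):
    x = proj_box(x - gamma * (H @ x + c))   # projected gradient step

print(x)  # every iterate is feasible
```

Projecting after every step keeps each iterate feasible, which is the key difference from clamping only once after the loop finishes.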

But obtaining the projection $P_{C_1\cap C_2}$ in closed form is difficult in general, so to solve this subproblem you may use

  1. Projections onto convex sets (POCS, a.k.a. alternating projections)

  2. or Dykstra's projection algorithm

  3. or Douglas-Rachford splitting

  4. or ADMM.
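For the constraints in the question, option 1 is straightforward to sketch: the box $C_1$ and each slab $\{x : LBA_i \le a_i^Tx \le UBA_i\}$ (one per row of $A$) all have closed-form projections, and cycling through them converges to a point in $C_1\cap C_2$ whenever the intersection is nonempty. Note that POCS returns a feasible point, not necessarily the nearest one; for the exact projection required by the update above, use Dykstra's algorithm instead. A NumPy sketch with the question's values:

```python
import numpy as np

A = np.array([[3.0, 2.0], [-6.0, 5.0]])
LB, UB = np.array([0.1, 0.5]), np.array([1.0, 1.4])
LBA, UBA = np.array([1.0, 1.0]), np.array([4.0, 5.0])

def proj_box(v):
    # Projection onto C1 = {LB <= x <= UB}
    return np.minimum(np.maximum(v, LB), UB)

def proj_slab(v, a, lo, hi):
    # Projection onto one slab {lo <= a.x <= hi} (one row of C2)
    t = a @ v
    if t > hi:
        return v - (t - hi) / (a @ a) * a
    if t < lo:
        return v + (lo - t) / (a @ a) * a
    return v

x = np.array([1.0, 1.0])          # starting point
for _ in range(500):              # POCS: cycle through all projections
    x = proj_box(x)
    for i in range(len(A)):
        x = proj_slab(x, A[i], LBA[i], UBA[i])

print(x)  # a point lying in both C1 and C2
```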

Reference: Patrick L. Combettes and Jean-Christophe Pesquet, "Proximal Splitting Methods in Signal Processing".