Minimize a vector from a matrix operation


I want to minimize a certain vector that results from a matrix operation, subject to some constraints, and I don't exactly know how to tackle this problem. Let's say we have $$ (L+A)\,s = v $$

  1. $L$ is the Laplacian matrix of a graph, of size $n \times n$
  2. $s$ is an $n \times 1$ vector
  3. $v$ is the result vector
  4. $A$ is an $n \times n$ matrix in which only certain pairs $(i, j)$ are allowed to be nonzero.

for example $$ \begin{pmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 0\\ \end{pmatrix} $$

Here the entries $(i,j) = (0,1)$ and $(i,j) = (1,0)$ are set to $1$ (only the value $1$ can be assigned).

  • The matrix $A$ can only take its pairs from the following matrix:

$$ \begin{pmatrix} 0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0\\ \end{pmatrix} + L $$

For example, if the result from above is $$ \begin{pmatrix} 0 & 1 & 1\\ 1 & 0 & 0\\ 1 & 0 & 0\\ \end{pmatrix} $$

then only two pairs (i.e. two candidate matrices $A$) are possible:

$$ \begin{pmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 0\\ \end{pmatrix} \quad \text{and} \quad \begin{pmatrix} 0 & 0 & 1\\ 0 & 0 & 0\\ 1 & 0 & 0\\ \end{pmatrix} $$

I want to find a way to rank the candidate pairs so that I minimize the result vector $v$ (the best pair, the second-best pair, etc.). What type of optimization problem is this, and can gradient descent be used here? If so, in which way?
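For concreteness, here is a small brute-force sketch of what I am trying to do. The Laplacian $L$, the vector $s$, and the use of the Euclidean norm as the meaning of "minimize $v$" are all my own assumptions for illustration:

```python
import numpy as np

# Made-up example data: Laplacian of a path graph on 3 nodes, arbitrary s.
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)
s = np.array([1.0, 2.0, 3.0])

# Mask of allowed pairs; only its upper triangle is enumerated since A is symmetric.
mask = np.array([[0, 1, 1],
                 [1, 0, 0],
                 [1, 0, 0]])
n = L.shape[0]
pairs = [(i, j) for i in range(n) for j in range(i + 1, n) if mask[i, j]]

# Enumerate each candidate A (one pair set to 1, symmetrically),
# compute v = (L + A) s, and rank candidates by ||v||.
results = []
for (i, j) in pairs:
    A = np.zeros_like(L)
    A[i, j] = A[j, i] = 1.0
    v = (L + A) @ s
    results.append(((i, j), np.linalg.norm(v)))

results.sort(key=lambda t: t[1])
for pair, norm in results:
    print(pair, norm)
```

This works for a single pair, but the number of candidate matrices grows combinatorially if several pairs can be active at once, which is why I suspect this is a discrete problem rather than one suited to gradient descent.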