Matrix Optimization with vectors as variables


I am trying to find an encoding $\vec{\phi}(x_i) \in \mathbb{R}^D$ for each of my samples such that $K \approx M(\vec{\phi})$. I thought this formulation could work: $$ \vec{\phi}^* = \operatorname{argmin}_{\vec{\phi}} \lVert K - M(\vec{\phi}) \rVert $$ where: $$ K = \begin{bmatrix} k(x_1-x_1) & \cdots & k(x_1-x_n) \\\\ \vdots & \ddots & \vdots \\\\ k(x_n-x_1) & \cdots & k(x_n-x_n) \end{bmatrix} \in \mathbb{R}^{n \times n}$$ $$ M(\vec{\phi}) = \begin{bmatrix} f(\vec{\phi}(x_1), \vec{\phi}(x_1)) & \cdots & f(\vec{\phi}(x_1), \vec{\phi}(x_n)) \\\\ \vdots & \ddots & \vdots \\\\ f(\vec{\phi}(x_n), \vec{\phi}(x_1)) & \cdots & f(\vec{\phi}(x_n), \vec{\phi}(x_n)) \end{bmatrix} \in \mathbb{R}^{n \times n} $$ $$ \vec{\phi} = ( \vec{\phi}(x_1), \vec{\phi}(x_2), \ldots, \vec{\phi}(x_n))\in \mathbb{R}^{n \times D} $$ The information I have about my variables:

  1. Matrix $K$ is symmetric and is built from a reproducing kernel $k(\cdot): \mathbb{R}^n \rightarrow \mathbb{R}$ (i.e., $k$ induces a Reproducing Kernel Hilbert Space), and I have all of its values.
  2. Function $f(\cdot\,,\cdot): \mathbb{R}^D \times \mathbb{R}^D \rightarrow \mathbb{R}$ is usually a dot product or cosine similarity, but in this case I want to allow it to be an arbitrary function. (Should I theoretically restrict it? Where can I read about this?)
  3. The encoding maps an input $x_i$ to an embedding $\vec{\phi}(x_i)$, i.e. $\phi: \mathbb{R}^n \rightarrow \mathbb{R}^D$, where usually $D \gg n$.
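For concreteness, here is a minimal sketch of the optimization in the simplest special case, where $f$ is taken to be the dot product, so $M(\vec{\phi}) = \Phi\Phi^\top$ for the stacked matrix $\Phi \in \mathbb{R}^{n \times D}$ and the problem becomes a matrix-factorization fit under the Frobenius norm, solved here by plain gradient descent. All names, the toy kernel, and the hyperparameters are illustrative assumptions, not part of the original question:

```python
import numpy as np

# Sketch: fit embeddings Phi (n x D) so that Phi @ Phi.T approximates K,
# i.e. f is assumed to be the dot product. Gradient descent on the
# squared Frobenius norm ||K - Phi Phi^T||_F^2.

rng = np.random.default_rng(0)
n, D = 20, 50  # D >> n, as in the question

# Toy symmetric PSD kernel matrix K from random points (Gaussian kernel).
X = rng.normal(size=(n, 3))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)

Phi = 0.1 * rng.normal(size=(n, D))  # small random initialization
lr = 0.01
for _ in range(2000):
    R = K - Phi @ Phi.T       # residual matrix
    grad = -4.0 * R @ Phi     # gradient of ||R||_F^2 w.r.t. Phi (K symmetric)
    Phi -= lr * grad

err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
print(f"relative Frobenius error: {err:.2e}")
```

For a general (possibly non-smooth or non-convex) $f$, the same objective can be handed to an automatic-differentiation framework, but whether the fit is well-posed depends on the restrictions placed on $f$, which is exactly the theoretical question raised above.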

I am not sure what keywords to search for this kind of problem, which norm I should use, which optimization methods apply, or how to improve the efficiency of possible solutions.

Any suggestions on what I should look for are welcome. Thank you very much!