Minimization of the squared difference of a generalized quadratic form and a given number


For $i,j\in [n]$, let $\alpha_{ij}\in\mathbb{R}$ be given. Consider the function $F:\mathbb{R}^d\times\cdots\times\mathbb{R}^d\to\mathbb{R}$ defined by $$ F(x_1,\cdots,x_n) = \sum_{i = 1}^n\sum_{j = 1}^n \alpha_{ij}(x_i,x_j) $$ where $(x_i,x_j)$ denotes the standard inner product of $x_i$ and $x_j$ in $\mathbb{R}^d$. I call this kind of function a generalized quadratic form since, for $d = 1$, it is exactly a quadratic form. In fact, letting $A = [\alpha_{ij}]_{n\times n}$ and $S = (A+A^T)/2$, $F$ may be written compactly as $F(x_1, \cdots, x_n) = \operatorname{tr}(XSX^T)$, where $X$ is the $d\times n$ matrix whose columns are $x_1,\cdots,x_n$.

My question is: given a fixed number $\alpha\in\mathbb{R}$, how do I solve $$ \min_{x_1,\cdots,x_n\in\mathbb{R}^d} (F(x_1,\cdots,x_n)-\alpha)^2 $$ and how do I find $x_1,\cdots,x_n$ that achieve the minimum?

For my purposes, we may assume the matrix $S$ constructed above is positive semi-definite, but I am also curious about the general situation. I wonder whether this problem can be solved by working with the eigenvalues and eigenvectors of $S$. Any useful comments or pointers to literature/topics would be appreciated.
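As a quick numerical sanity check of the identity $F = \operatorname{tr}(XSX^T)$, here is a small NumPy sketch; the dimensions and random coefficients are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3
A = rng.standard_normal((n, n))   # the coefficients alpha_{ij}
S = (A + A.T) / 2                 # symmetric part of A
X = rng.standard_normal((d, n))   # columns are x_1, ..., x_n

# F via the double sum of inner products
F_sum = sum(A[i, j] * (X[:, i] @ X[:, j]) for i in range(n) for j in range(n))

# F via the compact trace form tr(X S X^T)
F_tr = np.trace(X @ S @ X.T)

print(np.isclose(F_sum, F_tr))  # True: the two expressions agree
```

The agreement also shows why passing to $S$ is harmless: the skew-symmetric part of $A$ contributes nothing to the trace.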


Best answer

Note that $\frac{\partial F}{\partial x_j}=2\sum_{i=1}^{n}s_{ij}x_i$, where $s_{ij}$ are the entries of the symmetric matrix $S$; in matrix form, the gradient of $F$ with respect to $X$ is $2XS$. You can use these derivatives to find the stationary points of $(F-\alpha)^2$. You need to solve

$$2\left(\sum_{k=1}^{n}\sum_{l=1}^{n} \alpha_{kl}(x_k,x_l)-\alpha\right)\left(2\sum_{i=1}^{n}s_{ij}x_i\right)=0\quad\forall j \in \{1,\dots,n\}.$$
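This stationarity condition can be verified against a finite-difference approximation; the sketch below assumes $S$ symmetric, so the gradient of $G(X)=(F(X)-\alpha)^2$ with respect to $X$ is $2(F(X)-\alpha)\cdot 2XS$ (all sizes and data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, alpha = 4, 3, 2.0
A = rng.standard_normal((n, n))
S = (A + A.T) / 2
X = rng.standard_normal((d, n))

F = lambda X: np.trace(X @ S @ X.T)
G = lambda X: (F(X) - alpha) ** 2

# closed-form gradient of G at X: 2 (F(X) - alpha) * 2 X S
grad = 2 * (F(X) - alpha) * 2 * (X @ S)

# central-difference check of one entry of the gradient
eps = 1e-6
E = np.zeros_like(X)
E[0, 0] = eps
fd = (G(X + E) - G(X - E)) / (2 * eps)

print(np.isclose(grad[0, 0], fd))  # True: analytic and numeric gradients match
```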

Please double-check these computations. You will probably need Newton's method to solve the resulting system numerically. In addition, see this related question:

Multivariate Quadratic Regression
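As a lighter-weight alternative to Newton's method, one can run plain gradient descent on $(F-\alpha)^2$ using the gradient above. This is only a sketch: the step size, iteration count, and problem data are ad hoc choices, and convergence is not guaranteed in general.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, alpha = 4, 3, 2.0
A = rng.standard_normal((n, n))
S = (A + A.T) / 2                 # symmetric part, as in the question
X = rng.standard_normal((d, n))

def F(X):
    return np.trace(X @ S @ X.T)

# gradient descent on G(X) = (F(X) - alpha)^2, whose gradient in X is
# 2 (F(X) - alpha) * 2 X S; the step size lr is an ad hoc choice
lr = 1e-3
for _ in range(2000):
    X -= lr * 2 * (F(X) - alpha) * 2 * (X @ S)

print(abs(F(X) - alpha))  # small when the iteration finds a zero of F - alpha
```

Note that when $(F-\alpha)^2$ can reach $0$ (e.g. $S$ positive semi-definite with $\operatorname{tr}(S)>0$ and $\alpha\ge 0$), every $X$ with $F(X)=\alpha$ is a global minimizer, so the descent only needs to drive the residual $F(X)-\alpha$ to zero.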