Grothendieck's inequality guarantee for the semidefinite relaxation


I am struggling to understand the proof of Theorem 3.5.6 in Roman Vershynin's *High-Dimensional Probability*.

Theorem:

$$\text{INT}(A) = \max_{x_i = \pm 1 \text{ for } i = 1,\ldots,n} \sum_{i,j=1}^{n} A_{ij}x_ix_j. \tag{3.20} $$

$$\text{SDP}(A) = \max_{\|X_i\|_2 = 1 \text{ for } i = 1,\ldots,n} \sum_{i,j=1}^{n} A_{ij} \langle X_i, X_j \rangle. \tag{3.21} $$

Consider an $n \times n$ symmetric, positive-semidefinite matrix $A$. Let INT(A) denote the maximum in the integer optimization problem (3.20) and SDP(A) denote the maximum in the semidefinite problem (3.21). Then

$$\text{INT}(A) \leq \text{SDP}(A) \leq 2K \cdot \text{INT}(A) $$

The proof given: The first bound follows with $X_i = (x_i, 0, \dots, 0)$. The second bound follows from Grothendieck's inequality for symmetric matrices in Exercise 3.5.3. (Argue that one can drop the absolute values.)
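The first bound can be checked numerically. Below is a small sketch (not from the book): it brute-forces $\text{INT}(A)$ over all sign vectors for a random PSD matrix, then embeds the optimal signs as unit vectors $X_i = (x_i, 0, \dots, 0)$ and verifies that the SDP objective at this feasible point equals $\text{INT}(A)$, so $\text{INT}(A) \leq \text{SDP}(A)$.

```python
# Sanity-check sketch for the first bound INT(A) <= SDP(A):
# every sign vector yields a feasible SDP point with the same objective.
import itertools
import random

n = 4
random.seed(0)
# Build a random symmetric PSD matrix A = G^T G.
G = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
A = [[sum(G[k][i] * G[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]

def quad_form(x):
    """sum_{i,j} A_ij x_i x_j"""
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# INT(A): brute force over all 2^n sign vectors.
INT = max(quad_form(x) for x in itertools.product((-1, 1), repeat=n))

# Embed the optimal signs as unit vectors X_i = (x_i, 0, ..., 0);
# then <X_i, X_j> = x_i x_j, so the SDP objective reproduces INT(A).
best = max(itertools.product((-1, 1), repeat=n), key=quad_form)
X = [[xi] + [0.0] * (n - 1) for xi in best]
sdp_value_at_embedding = sum(
    A[i][j] * sum(a * b for a, b in zip(X[i], X[j]))
    for i in range(n) for j in range(n))
assert abs(sdp_value_at_embedding - INT) < 1e-9  # hence INT(A) <= SDP(A)
```

Since the embedded point is merely feasible for (3.21), its objective value lower-bounds the SDP maximum, which is exactly the first inequality.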

My understanding:

I understand the first bound: taking $X_i = (x_i, 0, \dots, 0)$ maps any feasible point of the integer problem (3.20) to a feasible point of the semidefinite problem (3.21) with the same objective value, which proves $\text{INT}(A) \leq \text{SDP}(A)$. However, I can't understand the second bound. If we use Grothendieck's inequality, does that mean we assume its hypothesis, i.e.

$$\forall x \in \{-1, 1\}^n: \left| \sum_{1 \leq i,j \leq n} a_{ij}x_i x_j \right| \leq 1$$

And if we do, why is the bound $2K \cdot \text{INT}(A)$ rather than just $2K$? And what happens if we don't have that assumption?

Best answer:

I am not sure what Exercise 3.5.3 says about the Grothendieck inequality, so I base my reply on the following statement of it (using the same notation as in your post): if
$$ \|A\|_{\epsilon} = \max_{x_i = \pm 1 \text{ for } i = 1,\ldots,n} \left|\sum_{i,j=1}^{n} A_{ij}x_ix_j\right| \leq 1, $$
then
$$ \|A\|_{H^\prime} = \max_{\|X_i\|_2 = 1 \text{ for } i = 1,\ldots,n} \left| \sum_{i,j=1}^{n} A_{ij} \langle X_i, X_j \rangle \right| \leq K_G, $$
where $K_G$ is the Grothendieck constant.

Given an arbitrary matrix $B$, set $A = B / \|B\|_{\epsilon}$. Then $\|A\|_\epsilon \leq 1$, and Grothendieck's inequality gives
$$ \|B\|_{H^\prime} \leq K_G \|B\|_\epsilon. $$
Since obviously $\mathrm{SDP}(\cdot) \leq \|\cdot\|_{H^\prime}$, it remains to show that $\|\cdot\|_\epsilon \leq 2\,\mathrm{INT}(\cdot)$ for symmetric positive-semidefinite matrices.
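The remaining step is where the hint "one can drop absolute values" enters: for symmetric positive-semidefinite $A$, the quadratic form satisfies $x^\top A x \geq 0$ for every $x$, so $\|A\|_\epsilon = \mathrm{INT}(A) \leq 2\,\mathrm{INT}(A)$. A brute-force sketch (my own, not from the book or the answer) checking this on small random PSD matrices:

```python
# For symmetric PSD A, x^T A x >= 0 for every sign vector x, so the
# absolute values in ||A||_eps can be dropped: ||A||_eps = INT(A),
# which in particular implies ||A||_eps <= 2 INT(A).
import itertools
import random

random.seed(1)
n = 4
for trial in range(20):
    # Random symmetric PSD matrix A = G^T G.
    G = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
    A = [[sum(G[k][i] * G[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]

    # Quadratic form over all 2^n sign vectors.
    vals = [sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
            for x in itertools.product((-1, 1), repeat=n)]

    assert min(vals) >= -1e-9              # form is nonnegative (PSD)
    eps_norm = max(abs(v) for v in vals)   # ||A||_eps
    INT = max(vals)                        # INT(A)
    assert abs(eps_norm - INT) < 1e-9      # absolute values drop out
```

Note this check only covers the PSD case, which is exactly the hypothesis of Theorem 3.5.6; for a general symmetric matrix the absolute values cannot be dropped.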