How to bound a quadratic form involving absolute values from above?


I am given a negative semidefinite symmetric matrix $M \in \mathbb{R}^{n \times n}$, arising as the Hessian of a concave function, and two vectors $u,v \in \mathbb{R}^n$, where $u$ lies in the standard simplex, i.e. $0 \leq u_i \leq 1$ and $u_1+\dots+u_n = 1$, and $v$ is arbitrary. Is it possible to bound

$\sum_{i=1}^n u_i \vert \sum_{k=1}^n (M_{ik} - \sum_{j=1}^n v_j M_{jk}) (u_k - v_k) \vert$

from above? It is clear that this expression can be bounded from below using the triangle inequality, but I need an upper bound, which may depend, for instance, on the eigenvalues of $M$ and/or the norms of $u$ and $v$. It can be assumed that $M$ is even negative definite, so that it has only negative eigenvalues. Can somebody help?

Best answer:

Disclaimer: this answer is not precisely what you asked for, in that it uses the following quantity:

$$ \max_{i=1,\dots,n} \|M_i\|_2 $$

where $M_i$ is the $i$th column of $M$. Furthermore, I assume that $M$ is symmetric (you state that it is the Hessian of a function, so this doesn't strike me as an unreasonable assumption). Let $\|\cdot\|$ denote the Euclidean norm and let $\langle\cdot,\cdot\rangle$ denote the inner product on $\mathbb{R}^n$. Furthermore, let $\lambda_{\max}$ denote the absolute value of the most negative eigenvalue of $M$. Then the quantity you provided can be rewritten as

$$ P = \sum_{i=1}^n \big| u_i\langle M_i,u\rangle -u_i\langle M_i,v\rangle - u_i\langle v, Mu\rangle + u_i \langle v,Mv\rangle\big| $$

Apply the triangle inequality. Then

$$ P \leq \sum_{i=1}^n \big| u_i\langle M_i,u\rangle\big|+\big|u_i\langle M_i,v\rangle\big| + \big|u_i\langle v, Mu\rangle\big| + \big|u_i \langle v,Mv\rangle\big|$$

$$ \leq \max_i \big|\langle M_i,u \rangle\big| + \max_i\big|\langle M_i,v \rangle\big| + \big|\langle v, Mu\rangle\big| + \big|\langle v, Mv \rangle\big|$$

where we have used that $u_i \geq 0$ and $\sum_i u_i = 1$: for the first two terms this gives $\sum_i u_i |a_i| \leq \max_i |a_i|$, and the last two terms do not depend on $i$, so the weighted sum leaves them unchanged. Now use the Cauchy-Schwarz inequality:

$$ P \leq (\|u\|+\|v\|)\max_i \|M_i\| + \|M\|\|u\|\|v\| + \|M\|\|v\|^2$$ $$ = (\|u\|+\|v\|)\max_i \|M_i\| + \lambda_{\max}\|u\|\|v\| + \lambda_{\max}\|v\|^2, $$

where the last equality holds because $M$ is symmetric, so its spectral norm $\|M\|$ equals the largest absolute eigenvalue $\lambda_{\max}$.
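As a sanity check, the bound can be verified numerically. Here is a NumPy sketch; the particular random negative definite $M$, simplex vector $u$, and arbitrary $v$ are illustrative choices, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Illustrative symmetric negative definite M (any such matrix works)
A = rng.standard_normal((n, n))
M = -(A @ A.T) - n * np.eye(n)

# u on the standard simplex, v arbitrary
u = rng.random(n)
u /= u.sum()
v = rng.standard_normal(n)

# P = sum_i u_i | <M_i, u - v> - <v, M(u - v)> |
# M @ (u - v) gives the per-i terms; v @ M @ (u - v) is a scalar broadcast over i
inner = M @ (u - v) - (v @ M @ (u - v))
P = np.sum(u * np.abs(inner))

# The derived upper bound
col_max = np.max(np.linalg.norm(M, axis=0))      # max_i ||M_i||_2
lam_max = np.max(np.abs(np.linalg.eigvalsh(M)))  # spectral norm of symmetric M
bound = ((np.linalg.norm(u) + np.linalg.norm(v)) * col_max
         + lam_max * np.linalg.norm(u) * np.linalg.norm(v)
         + lam_max * np.linalg.norm(v) ** 2)

assert P <= bound
```

Repeating this over many random draws (different seeds, dimensions, and scales of $v$) gives an empirical check that no step of the chain of inequalities was lost.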

I'll admit this isn't terrific, but it's something. If you wished, you could invoke the equivalence of norms and bound the max expression using the 1-norm of the matrix $M$, since

$$ \|M\|_1 = \max_{j=1,\dots,n}\sum_{i=1}^n |M_{ij}| = \max_{j=1,\dots,n} \|M_j\|_1. $$

Not sure whether using the 1-norm makes sense in your application. Alternatively, since $M$ is symmetric, the induced $\infty$-norm (the maximum absolute row sum) coincides with the 1-norm, so you could use either.
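For a quick check of that identity, here is a NumPy sketch; the random symmetric $M$ is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
M = (M + M.T) / 2  # symmetrize

# Induced 1-norm = maximum absolute column sum
col_sums = np.abs(M).sum(axis=0)
assert np.isclose(np.linalg.norm(M, 1), col_sums.max())

# For symmetric M, the induced inf-norm (max absolute row sum) agrees
assert np.isclose(np.linalg.norm(M, np.inf), np.linalg.norm(M, 1))
```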