Is dot product of two vectors always less than dot product of one of those vectors with a matrix transformation of the other?


For two vectors $\mathbf{u, v}$ with $n$ real positive entries each and an $n$-by-$n$ real symmetric matrix $\mathbf{M}$, I'm thinking that

$$\mathbf{u} \cdot \mathbf{M} \cdot \mathbf{v} > a\mathbf{u \cdot v}$$

where $a$ is a real scalar between $0$ and $1$.

First, is this inequality true? Second, why?

EDIT:

The kind of matrix I have in mind for $\mathbf{M}$ is

$$\mathbf{M=X^{-1}D(q)X}$$

where $\mathbf{X}$ is an $n$-by-$n$ symmetric matrix with real, non-zero entries, and $\mathbf{D(q)}$ is the diagonal matrix whose diagonal entries are those of the vector $\mathbf{q}$ (each generally between 0 and 1), with zeros elsewhere.
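As a quick sanity check on this construction (a NumPy sketch; variable names are my own, and the answers below use MATLAB instead): $\mathbf{X^{-1}D(q)X}$ is a similarity transform of $\mathbf{D(q)}$, so the eigenvalues of $\mathbf{M}$ are exactly the entries of $\mathbf{q}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical instance of the matrix described above:
# X symmetric with nonzero real entries, D(q) diagonal with q in (0, 1).
X = rng.uniform(0.1, 1.0, size=(n, n))
X = X + X.T                              # symmetrize X
q = rng.uniform(0.0, 1.0, size=n)
M = np.linalg.inv(X) @ np.diag(q) @ X

# A similarity transform preserves eigenvalues, so eig(M) matches q.
eigs = np.sort(np.linalg.eigvals(M).real)
assert np.allclose(eigs, np.sort(q))
```

Note that $\mathbf{M}$ built this way is generally *not* symmetric unless $\mathbf{X}$ is orthogonal, which matters for the answers below.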

There are 2 best solutions below

Answer 1

From what I can tell, there's no general easy statement we can make about the inequality based on the information you've provided; you can get either direction depending on $M$, $a$, $u$, and $v$.

Instead, I'd suggest that you look at the eigenvalues of $M$. Specifically, let $z = \operatorname{tr}(M)$. If we take $a \sim \mathrm{Unif}(0,1)$, and similarly for the entries of the vectors $q, u, v$, then with probability $z/n$ the inequality holds in the original direction.

I played around with the following MATLAB script:

n = 4;                          % dimension
k = 1000;                       % trials
data = zeros(k,1);
traces = zeros(k,1);

for i = 1:k
    u = rand(n,1);
    v = rand(n,1);
    a = rand(1);
    q = rand(n,1);
    M = diag(q);
    traces(i) = trace(M);
    X = orth(rand(n));          % orthogonal; w.p. 1 invertible
    M = inv(X)*M*X;             % same eigenvalues as initial M
    data(i) = u'*M*v - a*u'*v;
end
sum(data > 0)/k                 % empirical probability the inequality holds
mean(traces)                    % average trace; compare against n * probability

Try replacing $q$ with q = repmat(p, n, 1); for a fixed scalar $p$ to see the probability change to $p$.

Answer 2

The spectral theorem tells you $M$ always has real eigenvalues.

The argument below basically shows that your statement is false so long as $M \neq I$.

If $M$ is not positive definite (i.e., $M$ has a non-positive eigenvalue), the statement is false: take $u = v$ in the eigenspace of a non-positive eigenvalue of $M$; then the left-hand side is non-positive while the right-hand side is positive.
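This counterexample is easy to check numerically (a NumPy sketch of the argument above; the specific matrix and the value of $a$ are arbitrary choices of mine):

```python
import numpy as np

M = np.diag([-1.0, 1.0])      # symmetric, indefinite: eigenvalues -1 and 1
u = v = np.array([1.0, 0.0])  # eigenvector for the eigenvalue -1
a = 0.5                       # any a in (0, 1)

lhs = u @ M @ v               # = -1  (non-positive)
rhs = a * (u @ v)             # = 0.5 (positive)
assert not (lhs > rhs)        # the claimed inequality fails
```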

So you only need to consider positive definite $M$. Note that the left-hand side is itself an inner product of $u$ and $v$, given by $\langle u, v\rangle = u^T M v$. (Every inner product on $\mathbb{R}^n$ is of this form for some positive definite $M$.)

The first case is that an orthonormal basis for this inner product is not simply a rotated/scaled version of the standard basis. Then there will be a pair of unit vectors $u, v$ such that $\langle u, v\rangle = 0$ while $u, v$ are not orthogonal with respect to the usual inner product. So the LHS will be zero and the RHS positive (after negating $u$ or $v$ if necessary), showing the statement is false.

The remaining case is that $\langle\cdot,\cdot\rangle$ admits an orthonormal basis that is a rotated/scaled version of the standard basis. If it is just a rotated version, the two sides are equal, and the statement holds for any $a < 1$. So let's assume the basis is a scaled-and-rotated version. Since the right-hand side is invariant under rotation, we can apply the same rotation to both sides and consider only the case where $M$ provides an orthonormal basis that is a (positively) scaled version of the standard basis; a negative scaling would introduce a reflection, which would give a negative eigenvalue.

Let $u$ have components $x_i$ and $v$ components $y_i$. You are then comparing the LHS, $\sum_i \alpha_i x_i y_i$, to the RHS, $\sum_i x_i y_i$, for some fixed set of positive scalars $\alpha_i > 0$. For your condition to hold, you would need $\sum_i (\alpha_i - 1) x_i y_i > 0$ for all choices of $x_i, y_i$. But this isn't possible: just take $u = \operatorname{sgn}(\alpha_i - 1)\, e_i$ and $v = -e_i$ for any $i$ with $\alpha_i \neq 1$ (such an $i$ exists by the assumption that we aren't simply rotating an orthonormal basis for $M$).
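To make this last step concrete, here is the counterexample in NumPy (a sketch with an arbitrary scaling $\alpha_1 = 2 \neq 1$ of my own choosing):

```python
import numpy as np

alpha = np.array([2.0, 1.0])   # positive scalings, with alpha_1 != 1
M = np.diag(alpha)

i = 0                          # an index with alpha_i != 1
u = np.sign(alpha[i] - 1.0) * np.eye(2)[i]  # u = sgn(alpha_i - 1) e_i
v = -np.eye(2)[i]                           # v = -e_i

# sum_i (alpha_i - 1) x_i y_i = -|alpha_1 - 1| < 0, so the condition fails.
assert (u @ M @ v) - (u @ v) < 0
```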