Simpler proof for: if the matrix $M$ is symmetric, then $x^TMx = 0$ for all $x$ iff $M=0$


I was tasked to prove that if $M$ is a real symmetric matrix such that $$\boldsymbol{x}^TM\boldsymbol{x} = 0\ \forall\ \boldsymbol{x},$$ then $M=\boldsymbol{0}$.

I did manage to prove it, though in a rather roundabout way. I started by considering $$\boldsymbol{x}^TA\boldsymbol{x} = a,$$ where $A$ is a real skew-symmetric matrix and $a$ is a real number. Transposing both sides (the left side is a scalar, so it equals its own transpose), we get $$\boldsymbol{x}^TA^T\boldsymbol{x} = a\\ \boldsymbol{x}^TA\boldsymbol{x} = -a,$$ so $a = -a$, and we conclude that $a$ must be zero. Hence, it's proven that $$\boldsymbol{x}^TA\boldsymbol{x} = 0\ \forall\ \boldsymbol{x}.$$
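As a quick numeric sanity check (a sketch, not part of the proof), one can verify that the quadratic form of a skew-symmetric matrix vanishes; the matrix `A` and the test vectors below are arbitrary example choices:

```python
# Sanity check: x^T A x == 0 for skew-symmetric A (arbitrary example values).
def quad_form(M, x):
    """Compute x^T M x for a square matrix M given as nested lists."""
    n = len(x)
    return sum(x[i] * M[i][j] * x[j] for i in range(n) for j in range(n))

A = [[0, 2, -5],
     [-2, 0, 3],
     [5, -3, 0]]          # A^T = -A

for x in ([1, 2, 3], [-4, 0, 7], [2, -1, 5]):
    assert quad_form(A, x) == 0
```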

Next, I considered an arbitrary matrix $\Sigma$, not necessarily symmetric or skew-symmetric. It can be noted that $\Sigma + \Sigma^T$ is symmetric since $(\Sigma + \Sigma^T)^T = \Sigma^T + \Sigma = \Sigma + \Sigma^T$, and that $\Sigma - \Sigma^T$ is skew-symmetric since $(\Sigma - \Sigma^T)^T = \Sigma^T - \Sigma = -(\Sigma - \Sigma^T)$. Furthermore, it can be verified that $\Sigma$ can be decomposed as $$\Sigma = {\Sigma + \Sigma^T\over 2} + {\Sigma - \Sigma^T\over 2},$$ that is, we can express $\Sigma$ as the sum of a symmetric and a skew-symmetric matrix.
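This decomposition is easy to check numerically; the following sketch (with an arbitrarily chosen `Sigma`) confirms that the two parts are symmetric and skew-symmetric, respectively, and sum back to the original:

```python
# Split an arbitrary square matrix into symmetric part S and skew part K.
Sigma = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 10]]      # arbitrary: neither symmetric nor skew-symmetric

n = len(Sigma)
S = [[(Sigma[i][j] + Sigma[j][i]) / 2 for j in range(n)] for i in range(n)]
K = [[(Sigma[i][j] - Sigma[j][i]) / 2 for j in range(n)] for i in range(n)]

assert all(S[i][j] == S[j][i] for i in range(n) for j in range(n))     # symmetric
assert all(K[i][j] == -K[j][i] for i in range(n) for j in range(n))    # skew
assert all(S[i][j] + K[i][j] == Sigma[i][j] for i in range(n) for j in range(n))
```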

Finally, I considered the equation $$\boldsymbol{x}^T\Sigma\boldsymbol{x} = 0,$$ which can be split as $$\boldsymbol{x}^T\underbrace{\Sigma + \Sigma^T\over 2}_{\text{symmetric}}\boldsymbol{x} + \boldsymbol{x}^T\underbrace{\Sigma - \Sigma^T\over 2}_{\text{skew-symmetric}}\boldsymbol{x} = 0.$$ It was already shown that $\boldsymbol{x}^TA\boldsymbol{x}=0$ whenever $A$ is skew-symmetric; therefore $$\boldsymbol{x}^T{\Sigma + \Sigma^T\over 2}\boldsymbol{x}=0,$$ which implies that $${\Sigma + \Sigma^T\over 2} = 0,$$ meaning that for $\boldsymbol{x}^T\Sigma\boldsymbol{x}=0$ to hold for all $\boldsymbol{x}$, the symmetric part of $\Sigma$, and by extension the original matrix $M$, must be $\boldsymbol{0}$.

While my work does prove the first statement, I feel like it should be possible to prove it in far fewer steps, but I'm stumped, and any pointers would be much appreciated.

5 Answers

Best answer:

Given that $M$ is a real symmetric matrix, we can decompose it as $$ M = Q \Lambda Q^T,$$ where $Q$ is an orthogonal matrix whose columns are the real, orthonormal eigenvectors of $M$, and $\Lambda$ is a diagonal matrix whose entries are the eigenvalues of $M$. Let us pick an eigenvalue/eigenvector pair $(\lambda, v)$: $$Mv = \lambda v.$$ Multiplying both sides on the left by $v^T$ and using the hypothesis gives $$0 = v^T M v = \lambda v^T v = \lambda \|v\|^2.$$ Since $\|v\|\neq0$ by the definition of an eigenvector, we must conclude that $\lambda = 0$. The pair was arbitrary, so every eigenvalue is $0$. Thus the eigenvalue matrix $\Lambda$ is the zero matrix, and in turn $M = Q\Lambda Q^T = 0$.
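The key step, $v^TMv = \lambda\|v\|^2$ for an eigenpair, can be illustrated numerically; the matrix `M` and the eigenpair below are arbitrary example choices:

```python
# Illustrate: for an eigenpair (lam, v) of M, v^T M v = lam * ||v||^2,
# so v^T M v = 0 would force lam = 0.
M = [[1, 2],
     [2, 1]]              # symmetric; eigenvalues are 3 and -1

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v, lam = [1, 1], 3                               # M v = 3 v
assert matvec(M, v) == [lam * c for c in v]
assert dot(v, matvec(M, v)) == lam * dot(v, v)   # v^T M v = lam * ||v||^2
```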

Another answer:

We can write $x^TMx=\mathrm{tr}(Mxx^T)$ for any matrix $M$, where $\mathrm{tr}$ is the trace operator. Note that $xx^T$ is a symmetric matrix.

We can split $M$ into a symmetric $S$ and a skew-symmetric $K$ part: $M=S+K$. So we have that: $$x^TMx=\mathrm{tr}(Sxx^T)+\mathrm{tr}(Kxx^T)$$ as the trace is a linear operator.

The trace term with $K$ is always $0$, since the trace of the product of a skew-symmetric and a symmetric matrix is always zero: $\mathrm{tr}(Kxx^T) = \mathrm{tr}\big((Kxx^T)^T\big) = \mathrm{tr}(xx^TK^T) = -\mathrm{tr}(Kxx^T)$. So only the symmetric part matters.
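Both identities used so far, $x^TMx = \mathrm{tr}(Mxx^T)$ and $\mathrm{tr}(KS) = 0$ for skew-symmetric $K$ and symmetric $S$, can be checked numerically; the matrices and vector below are arbitrary examples:

```python
# Check tr(K S) == 0 for skew K and symmetric S, and x^T M x == tr(M x x^T).
def trace_prod(A, B):
    """tr(A @ B) without forming the product."""
    n = len(A)
    return sum(A[i][j] * B[j][i] for i in range(n) for j in range(n))

K = [[0, 4, -1],
     [-4, 0, 2],
     [1, -2, 0]]          # skew-symmetric
S = [[3, 1, 0],
     [1, 5, -2],
     [0, -2, 7]]          # symmetric
assert trace_prod(K, S) == 0

x = [1, 2, 3]
M = [[1, 2, 0],
     [3, 4, 1],
     [0, 2, 5]]           # arbitrary
xxT = [[a * b for b in x] for a in x]
quad = sum(x[i] * M[i][j] * x[j] for i in range(3) for j in range(3))
assert quad == trace_prod(M, xxT)
```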

The first term equals $\sum_{ij}S_{ij}x_ix_j$. For this to vanish for every $x$, we can first show that the diagonal entries of $S$ must be zero (by choosing $x$ to be a unit basis vector) and then that every off-diagonal entry must also be zero (by choosing $x$ to be the sum of two unit basis vectors).

For example, $S_{11}$ must be zero because for $x=(1,0,\dots,0)^T$ the sum reduces to $S_{11}=0$. The same goes for every other $S_{ii}$.

For $S_{12}$, we can use $x=(1,1,0,\dots,0)^T$ and get $S_{12}+S_{21}=0$; since $S_{12}=S_{21}$, both must be zero. This extends directly to every other $S_{ij}$.

Another answer:

Let $e_i$ denote the column vector with a $1$ in position $i$ and $0$ in every other entry.

Part 1: let $x = e_i$, so that $x^T M x = M_{ii}.$ By hypothesis this is zero, so $M_{ii}=0.$

Part 2: take $j \neq i$ and let $x = e_i + e_j.$ Then $$x^T M x = M_{ii} + M_{ij} + M_{ji} + M_{jj}.$$ We already know that $M_{ii} = M_{jj} = 0,$ so far giving $$ 0 = x^T M x = M_{ij} + M_{ji}.$$ However, we are told the matrix is symmetric, so $$ 0 = x^T M x = M_{ij} + M_{ji} = 2 M_{ij},$$ and $M_{ij} = 0.$
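A small numeric sketch of this basis-vector argument (with an arbitrarily chosen symmetric `M`): plugging in $e_1$ reads off $M_{11}$, and $e_1+e_2$ reads off $M_{11}+M_{12}+M_{21}+M_{22}$:

```python
# Read matrix entries off the quadratic form using standard basis vectors.
def quad(M, x):
    n = len(x)
    return sum(x[i] * M[i][j] * x[j] for i in range(n) for j in range(n))

M = [[4, 5, -2],
     [5, 7, 1],
     [-2, 1, 3]]          # arbitrary symmetric example

e1 = [1, 0, 0]
assert quad(M, e1) == M[0][0]                  # x = e_i picks out M_ii

x = [1, 1, 0]                                  # x = e_1 + e_2
assert quad(M, x) == M[0][0] + M[0][1] + M[1][0] + M[1][1]
```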

Another answer:

The classical proof is already fairly simple. It makes use of the connection between symmetric bilinear forms and quadratic forms, and it works over any field of characteristic $\ne2$. Specifically, since $M$ is symmetric, we have $$ (u+v)^TM(u+v)-(u-v)^TM(u-v)=4u^TMv $$ for any vectors $u$ and $v$. So, if $x^TMx=0$ for all $x$ and the characteristic of the underlying field is not $2$, we must have $u^TMv=0$ for all $u$ and $v$. In turn, taking $u = Mv$ gives $\|Mv\|^2 = 0$, so $Mv$ must be zero for every $v$. Hence $M=0$.
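The polarization identity at the heart of this proof is easy to check numerically; `M`, `u`, and `v` below are arbitrary example choices (with `M` symmetric):

```python
# Check: (u+v)^T M (u+v) - (u-v)^T M (u-v) == 4 u^T M v for symmetric M.
def bilin(M, u, v):
    """Compute u^T M v."""
    n = len(u)
    return sum(u[i] * M[i][j] * v[j] for i in range(n) for j in range(n))

M = [[2, -1],
     [-1, 3]]             # symmetric
u, v = [1, 2], [3, -1]

upv = [a + b for a, b in zip(u, v)]   # u + v
umv = [a - b for a, b in zip(u, v)]   # u - v
assert bilin(M, upv, upv) - bilin(M, umv, umv) == 4 * bilin(M, u, v)
```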

Another answer:

For any $x$, let $y= M x$ and expand $$ (x+y)^TM(x+y) - x^TM x - y^TM y = x^T M y + y^TM x = 2 \|M x\|^2, $$ where the last equality uses the symmetry of $M$ (e.g. $x^TMy = x^TM^TMx = \|Mx\|^2$). The three terms on the left are $0$ by hypothesis, so $\|Mx\|=0$, i.e. $Mx=0$ for every $x$, and hence $M=0$.
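This identity can also be verified numerically; the symmetric `M` and vector `x` below are arbitrary examples:

```python
# Check: (x+y)^T M (x+y) - x^T M x - y^T M y == 2 ||Mx||^2 when y = M x.
def quad(M, x):
    n = len(x)
    return sum(x[i] * M[i][j] * x[j] for i in range(n) for j in range(n))

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

M = [[1, 2],
     [2, 1]]              # symmetric
x = [1, 3]
y = matvec(M, x)                          # y = Mx
xpy = [a + b for a, b in zip(x, y)]       # x + y

lhs = quad(M, xpy) - quad(M, x) - quad(M, y)
assert lhs == 2 * sum(c * c for c in y)   # 2 * ||Mx||^2
```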