Suppose we have $x,y\in\mathbb R^n$ with $x^Ty\geq 0$, and let $M\in\mathbb R^{m\times n}$. We know that $M^TM$ is a positive semi-definite matrix.
Can we infer that $x^TM^TMy\geq 0$? If not, is there a condition that makes this hold? Thanks in advance.
Something with the flavor of a converse of your statement is true: given nonzero vectors $x,y\in V$ for some inner product space $V$, there exists a positive-definite operator $M$ taking $x$ to $y$ (i.e. $Mx=y$) if and only if $\langle x,y\rangle> 0$. For the forward direction, $\langle x,y\rangle = \langle x,Mx\rangle > 0$ since $M$ is positive definite and $x\neq 0$. For the reverse direction, take $M=yy^T/\langle x,y\rangle + I - xx^T/\langle x,x\rangle$: the first term sends $x$ to $y$ and the second annihilates $x$, so $Mx=y$; moreover $M$ is positive definite, being a sum of two positive semi-definite matrices whose kernels intersect trivially (the second term vanishes only on multiples of $x$, where the first term is strictly positive since $\langle x,y\rangle>0$).
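A quick numerical sanity check of this construction (not part of the argument; the dimension and the random vectors are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
y = rng.standard_normal(4)
if x @ y <= 0:
    y = -y  # ensure <x, y> > 0

# Rank-one map sending x to y; only positive SEMI-definite on its own.
M1 = np.outer(y, y) / (x @ y)
assert np.allclose(M1 @ x, y)

# Adding the projector onto the orthogonal complement of x keeps
# M x = y (the added term annihilates x) and makes M positive definite.
M = M1 + np.eye(4) - np.outer(x, x) / (x @ x)
assert np.allclose(M @ x, y)
assert np.all(np.linalg.eigvalsh(M) > 0)  # strictly positive spectrum
```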
Edit: one silly condition for your question to work can be obtained by looking at the eigendecomposition of $M^TM$. Namely, if $M^TM = \sum_{i=1}^n\sigma_i^2 v_iv_i^T$ for an orthonormal eigenbasis $\{v_i\}$ (where the $\sigma_i$ are the singular values of $M$), then what we want is equivalently expressed as $\sum_{i=1}^n \alpha_i\beta_i\sigma_i^2\geq 0$, where $\{\alpha_i\}$ are the coefficients of $x$ in the $v_i$ basis and $\{\beta_i\}$ are those of $y$. Maybe you can derive some sufficient condition convenient for your situation from this?
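To see the equivalence concretely, here is a small check of the identity $x^TM^TMy=\sum_i\alpha_i\beta_i\sigma_i^2$ (the shapes and random data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 4))  # arbitrary m x n
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Eigendecomposition of M^T M: eigenvalues are squared singular values.
sigma2, V = np.linalg.eigh(M.T @ M)
alpha = V.T @ x  # coefficients of x in the orthonormal eigenbasis
beta = V.T @ y   # coefficients of y
assert np.isclose(np.sum(alpha * beta * sigma2), x @ M.T @ M @ y)
```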
This certainly isn't always true. Counterexample: $x=(1,0)^T,\ y=(1,1)^T,\ M=\pmatrix{1&-2\\ -2&4}^{1/2}$.
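Verifying the counterexample numerically (here the square root is computed by hand: the matrix equals $vv^T$ with $v=(1,-2)^T$, so its PSD square root is $vv^T/\sqrt5$):

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])
A = np.array([[1.0, -2.0], [-2.0, 4.0]])

# A = v v^T with v = (1, -2), so its PSD square root is v v^T / sqrt(5).
v = np.array([1.0, -2.0])
M = np.outer(v, v) / np.sqrt(5)
assert np.allclose(M @ M, A)   # M really is A^{1/2}
assert x @ y > 0               # x^T y = 1
assert x @ M.T @ M @ y < 0     # x^T M^T M y = 1 - 2 = -1
```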
If $x^TM^TMy\ge0$ whenever $x^Ty\ge0$, then by considering $x=e_i$ and $y=e_i+ke_j$ with $i\ne j$ and $k$ arbitrary (note $x^Ty=1\ge0$), we get $(M^TM)_{ii}+k(M^TM)_{ij}\ge0$ for every $k\in\mathbb R$, which forces $(M^TM)_{ij}=0$; hence $M^TM$ must be a diagonal matrix. But this property should still hold if we perform an orthogonal change of basis on $\mathbb R^n$. It follows that $M^TM$ must be a scalar multiple of the identity matrix, i.e. $M$ is a scalar multiple of some matrix with orthonormal columns (and if $M\ne0$, this implies $m\ge n$).
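Conversely, such an $M$ does preserve the sign of the inner product, since $x^TM^TMy=c^2\,x^Ty$. A small illustration (the sizes, the scalar $c=2$, and the QR trick for generating orthonormal columns are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
# M = c * Q, where Q (m x n, m >= n) has orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))
M = 2.0 * Q
assert np.allclose(M.T @ M, 4.0 * np.eye(3))  # scalar multiple of I

x = rng.standard_normal(3)
y = rng.standard_normal(3)
# x^T M^T M y = c^2 x^T y, so the sign of x^T y is preserved.
assert np.isclose(x @ M.T @ M @ y, 4.0 * (x @ y))
```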