This is probably a simple question, but I need some help.
Consider a vector $x\in \mathbb{R}^n$ and a real $n\times n$ matrix $A$. I'm interested in the set of $y\in\mathbb{R}^n$ such that $x,y,Ay$ are linearly dependent.
To rule out trivial cases in which the vectors $x,y,Ay$ are linearly dependent for every $y$, I assume that:
- $n>2$
- $x\neq0$
- $A$ is not a scalar multiple of the identity matrix $I_n$
- the column space of $aI_n-A$ is not spanned by $x$, for any real scalar $a$.
It seems to me that the set of $y\in\mathbb{R}^n$ such that $x,y,Ay$ are linearly dependent should have zero $n$-dimensional Lebesgue measure.
Is this correct, and how would I go about proving this?
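As a quick numerical sanity check (a sketch using numpy; the specific $n$ and the random choices of $A$ and $x$ are my own, and a generic random pair almost surely satisfies the assumptions), sampling random $y$ never seems to produce a dependent triple:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                  # any n > 2
A = rng.standard_normal((n, n))        # a generic A almost surely meets the assumptions
x = rng.standard_normal(n)

# Sample many random y and record rank([x | y | Ay]).
ranks = [np.linalg.matrix_rank(np.column_stack([x, y, A @ y]))
         for y in rng.standard_normal((1000, n))]
print(all(r == 3 for r in ranks))      # no sampled y gives a dependent triple
```

This is of course consistent with (but no proof of) the measure-zero claim.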
My thinking so far:
Let $M_x=I_n-xx'/(x'x)$ be the orthogonal projection onto the orthogonal complement of $\operatorname{span}(x)$. What I need to do, I think, is to find the measure of the set of $y\in\mathbb{R}^n$ such that $M_x y$ and $M_x Ay$ are collinear. Setting aside the set $\operatorname{span}(x)$ where $M_x y=0$ (a line, hence already null since $n>2$), this is the set of $y\in\mathbb{R}^n$ such that $M_x(aI_n-A)y=0$ for some $a\in\mathbb{R}$.
Now, for any fixed $a$, the set $$S_a=\{y\in\mathbb{R}^n:M_x(aI_n-A)y=0\}$$ has zero $n$-dimensional Lebesgue measure: by the assumption above that the column space of $aI_n-A$ is not spanned by $x$, we have $M_x(aI_n-A)\neq0$, so $S_a$ is a proper linear subspace of $\mathbb{R}^n$.
But does the set $$\{y\in\mathbb{R}^n:M_x(aI_n-A)y=0 \text{ for some } a\in\mathbb{R} \}$$ (an uncountable union of the null spaces $S_a$ over $a$) have zero $n$-dimensional Lebesgue measure?
EDIT: The set of $y$ for which $(x,y,Ay)$ has rank $\le 2$ is an algebraic variety (the set where the determinants of all $3 \times 3$ submatrices are $0$), so it either has measure $0$ or is the whole space. Thus in a counterexample, $(x,y,Ay)$ would have rank $\le 2$ for all $y \in \mathbb R^n$.
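The variety description can be made concrete symbolically (a sketch with sympy; the particular $n=3$, $x=e_1$, and cyclic-shift $A$ are illustrative choices that satisfy the assumptions). For $n=3$ the only $3\times3$ minor is the determinant itself, and it comes out as a nonzero polynomial in $y$, so its zero set has measure zero:

```python
import sympy as sp

y = sp.Matrix(sp.symbols('y1 y2 y3'))
x = sp.Matrix([1, 0, 0])
A = sp.Matrix([[0, 1, 0],
               [0, 0, 1],
               [1, 0, 0]])           # cyclic shift: not scalar, col space of aI - A never span(x)

M = x.row_join(y).row_join(A * y)    # the 3x3 matrix (x, y, Ay)
det = sp.expand(M.det())             # = y1*y2 - y3**2, a nonzero polynomial in y
print(det)
```

Since $\det$ is not identically zero here, the dependent set is a proper variety, i.e. a null set.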
Suppose $x, A$ were a counterexample. Then $Ux, UAU^{-1}$ would also be a counterexample for any invertible $n \times n$ matrix $U$. Thus we can assume wlog $x$ is the unit vector $e_1$. Writing (corresponding to the first entry and the other $n-1$) $$y = \pmatrix{y_1\cr u\cr},\ A = \pmatrix{a & b^T\cr c & D}$$ we have $$ (x, y, Ay) = \pmatrix{1 & y_1 & a y_1 + b^T u \cr 0 & u & y_1 c + D u}$$ This has rank $\le 2$ iff $(u, y_1 c + D u)$ has rank $\le 1$.

If that is the case for all $y$, it is in particular the case when $y_1 = 0$, i.e. $(u, Du)$ has rank $\le 1$ for all $u$. That says $Du$ is always a scalar multiple of $u$. Now if $D v = \alpha v$ and $D w = \beta w$ with $\beta \ne \alpha$ and $v, w$ nonzero, then $v$ and $w$ are linearly independent and $D (v + w) = \alpha v + \beta w$ is not a scalar multiple of $v+w$. So we must have $\beta = \alpha$, i.e. $D = d I$ for some scalar $d$.

Next, $(u, y_1 c + D u) = (u, y_1 c + d u)$ has the same rank as $(u, y_1 c)$, so if this rank is always $\le 1$, we must have $c = 0$. That leaves $$ A = \pmatrix{a & b^T\cr 0 & dI},\ A - dI = \pmatrix{a-d & b^T\cr 0 & 0\cr}$$ which has its column space spanned by $x$, contrary to the assumption.
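The endgame of this argument can also be checked numerically: for $A$ of the excluded block form $\pmatrix{a & b^T\cr 0 & dI}$ with $x = e_1$, every $y$ gives a dependent triple (a sketch assuming numpy; the particular values of $a$, $d$, $b$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
a, d = 2.0, 5.0
b = rng.standard_normal(n - 1)

# The excluded matrix from the argument: A - dI has column space span(e1).
A = np.zeros((n, n))
A[0, 0] = a
A[0, 1:] = b
A[1:, 1:] = d * np.eye(n - 1)
x = np.eye(n)[:, 0]                  # x = e_1

ranks = [np.linalg.matrix_rank(np.column_stack([x, y, A @ y]))
         for y in rng.standard_normal((500, n))]
print(max(ranks))                    # 2: x, y, Ay are always dependent here
```

Indeed $Ay = d\,y + ((a-d)y_1 + b^T u)\,e_1$ is a linear combination of $y$ and $x$, which is exactly why the assumption on the column space of $aI_n - A$ is needed.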