How to use
- Vector space $V = U \oplus W$
- Linear map $f: V \to V,\ x \mapsto \begin{cases} x, & x \in U \\ 0, & x \in W \end{cases}$
- $f = f \circ f$
to create
- Matrix $A \in M_{2,2}(\mathbb{Q})$
- $A \cdot A = A$
- $A \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$
?
The definition of $f$ tells us that $U$ is its image and $W$ its kernel. Moreover, the restriction of $f$ to $U$ is the identity map. If we assume that both $U$ and $W$ are nontrivial, then with a suitable choice of basis the matrix of $f$ is $\operatorname{diag}(1,0)$, i.e., $$A = B\begin{bmatrix}1&0\\0&0\end{bmatrix}B^{-1}$$ for some invertible matrix $B$.
The condition $A(1,1)^T=(1,1)^T$ tells us that $(1,1)^T\in U$. Taking that as the basis for $U$, $$B = \begin{bmatrix}1&*\\1&*\end{bmatrix}.$$ Now, since $V=U\oplus W$, choose any vector that’s not a multiple of $(1,1)^T$ to generate $\ker f$. That vector goes in place of the *s above. For an orthogonal projection, pick a vector orthogonal to $(1,1)^T$.
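As a sanity check, here is that construction carried out with one concrete choice (mine, not forced by the question): $U = \langle(1,1)^T\rangle$ and the orthogonal complement $W = \langle(1,-1)^T\rangle$. Exact rationals keep us inside $M_{2,2}(\mathbb{Q})$:

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# Columns of B span U = <(1,1)> and W = <(1,-1)> respectively.
B     = [[F(1), F(1)], [F(1), F(-1)]]
B_inv = [[F(1, 2), F(1, 2)], [F(1, 2), F(-1, 2)]]   # inverse of B
D     = [[F(1), F(0)], [F(0), F(0)]]                # diag(1, 0)

A = matmul(matmul(B, D), B_inv)
# A works out to [[1/2, 1/2], [1/2, 1/2]]
print(matmul(A, A) == A)                    # idempotent: A.A = A
print(matmul(A, [[F(1)], [F(1)]]))          # fixes (1,1)^T
```

So this choice gives $A = \frac{1}{2}\begin{bmatrix}1&1\\1&1\end{bmatrix}$, the orthogonal projection onto the line through $(1,1)^T$.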
A similar argument can be made in terms of eigenvalues. The definition of $f$ tells us that the possible eigenvalues of $A$ are $1$ and $0$, and that $(1,1)^T$ is an eigenvector with eigenvalue $1$. $A$ is therefore either the identity matrix (trivial kernel) or similar to $\operatorname{diag}(1,0)$. All that’s left is to choose a vector in the kernel of $f$.
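To see that the kernel vector need not be orthogonal to $(1,1)^T$, here is another (again, my own) choice: $W = \ker f = \langle(1,0)^T\rangle$. Forcing $A(1,1)^T=(1,1)^T$ and $A(1,0)^T=0$ pins down $A$, and the characteristic polynomial confirms the eigenvalues $0$ and $1$:

```python
from fractions import Fraction as F

# A maps (1,1) -> (1,1) and (1,0) -> (0,0); solving those four
# linear conditions gives:
A = [[F(0), F(1)], [F(0), F(1)]]

# Characteristic polynomial of a 2x2 matrix: t^2 - tr(A)*t + det(A).
tr  = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
print(tr, det)   # 1 0, i.e. t(t - 1), so the eigenvalues are 0 and 1

# (1,1)^T is an eigenvector for 1; (1,0)^T spans the kernel.
assert [A[0][0] + A[0][1], A[1][0] + A[1][1]] == [1, 1]
assert [A[0][0], A[1][0]] == [0, 0]
```

This $A = \begin{bmatrix}0&1\\0&1\end{bmatrix}$ is an oblique (non-orthogonal) projection, which is why it isn't symmetric.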
If you allow $U$ or $W$ to be trivial, that opens up the possibilities $A=0$ and $A=I$. We can reject the first since we know of at least one vector that’s not mapped to $0$, but with $W=\{0\}$ we could have the identity matrix.