If $w=\begin{bmatrix} w_1\\ \vdots\\w_n \end{bmatrix}$ is a vector in $K^n$ for a field $K$ and $A= \begin{bmatrix} \lambda & & \\ & \ddots & a_{ij}\\ & & \lambda \end{bmatrix}$ is an $n\times n$ matrix with entries in $K$, $\lambda\in K^{*}$, and a single nonzero off-diagonal entry $a_{ij}$ (so $i\ne j$), then $Aw=\lambda w+a_{ij}w_j e_i$, which lies outside $\langle w\rangle$ whenever $w_j\ne 0$ and $w$ is not a multiple of $e_i$. For such $w$ we get $\langle A w\rangle\ne \langle w\rangle$, where $\langle w\rangle$ is the $1$-dimensional subspace generated by $w$.
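For instance, here is a $2\times 2$ sanity check of this claim (a sketch with one off-diagonal entry $a\neq 0$ in position $(1,2)$, assuming $w_2\neq 0$):

```latex
% A has diagonal \lambda and one off-diagonal entry a in position (1,2):
A = \begin{pmatrix} \lambda & a \\ 0 & \lambda \end{pmatrix},
\qquad
Aw = \begin{pmatrix} \lambda w_1 + a w_2 \\ \lambda w_2 \end{pmatrix}.
% If Aw = \mu w for some \mu, the second coordinate forces \mu = \lambda
% (since w_2 \neq 0), and then the first coordinate gives a w_2 = 0,
% contradicting a \neq 0. Hence Aw \notin \langle w \rangle.
```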
But if $A$ is a general matrix in $GL_n(K)$, not of the form $\lambda I_n$, then all we have is $Aw=\sum\limits_{i=1}^n\sum\limits_{j=1}^n a_{ij}w_je_i$.
How can one show that $Aw\notin \langle w\rangle$ for some choice of $w$?
Can we decompose $A$ as a sum $\sum A_{ij}$, where each $A_{ij}$ has a simple form like the one above (so $A_{ij}w\notin \langle w\rangle$), and conclude that $Aw=\underbrace{\sum A_{ij}w}_{\notin \langle w\rangle} \notin \langle w\rangle$?
For context, the original problem is to prove that $AW=W$ for every $1$-dimensional subspace $W$ of $K^n$ if and only if $A=\lambda I_n$ for some $\lambda\in K^{*}$.
The condition $Aw\in\langle w\rangle$, applied to the standard basis vectors $e_1,\dots,e_n$ of $K^n$, forces $A$ to be diagonal: $Ae_i\in\langle e_i\rangle$ says the $i$-th column of $A$ is a multiple of $e_i$. Now take two distinct basis vectors $e_i$ and $e_j$: since they are linearly independent, $Ae_i=\lambda_i e_i$ and $Ae_j=\lambda_j e_j$ together with $A(e_i+e_j)=\mu(e_i+e_j)$ imply $\lambda_i=\lambda_j(=\mu)$.
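Spelled out, for two distinct standard basis vectors $e_i, e_j$ with $Ae_i=\lambda_i e_i$, $Ae_j=\lambda_j e_j$, and $A(e_i+e_j)=\mu(e_i+e_j)$:

```latex
% Expand A(e_i + e_j) two ways and compare coefficients:
A(e_i + e_j) = \lambda_i e_i + \lambda_j e_j = \mu(e_i + e_j)
\;\Longrightarrow\;
(\lambda_i - \mu)\, e_i + (\lambda_j - \mu)\, e_j = 0
\;\Longrightarrow\;
\lambda_i = \mu = \lambda_j,
% where the last step uses the linear independence of e_i and e_j.
```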