I'm trying to understand whether, for every symmetric matrix $A \in R^{n\times n}$ and every $b \in R^n$, there exists a vector $x \in R^n$ such that $Ax+b=\lambda x$ for some scalar $\lambda$.
I'm not sure if there is a proper name for this.
Here is what I have so far. By the spectral theorem, there is a basis $\{v_1,v_2,\dots,v_n\}$ of $R^n$ consisting of eigenvectors of $A$, with $Av_i=\lambda_i v_i$ where $\lambda_i$ is the eigenvalue corresponding to $v_i$. Since we know where the basis vectors go under the transformation, we can work out where every other vector goes as well. Expanding $x$ and $b$ in this basis:
$$x \in R^n \implies x=\sum_{i=1}^{n} \alpha_iv_i$$
$$b \in R^n \implies b=\sum_{i=1}^{n} \beta_iv_i$$
$$Ax=A\sum_{i=1}^{n} \alpha_iv_i=\sum_{i=1}^{n} \alpha_iAv_i=\sum_{i=1}^{n} \alpha_i\lambda_iv_i$$
so we are looking for $\lambda$ that satisfies $$0=Ax+b-\lambda x=\sum_{i=1}^{n} \alpha_i\lambda_iv_i+\sum_{i=1}^{n} \beta_iv_i-\sum_{i=1}^{n} \lambda\alpha_iv_i= \sum_{i=1}^{n} (\alpha_i(\lambda_i-\lambda)+\beta_i)v_i$$
The $v_i$ are linearly independent, so $\alpha_i(\lambda_i-\lambda)+\beta_i=0$ for all $i$.
Here is where I'm stuck: I'm not sure how to prove or disprove that such a $\lambda$ always exists.
For any $A$ and $b$: if you select a $\lambda$ that is not an eigenvalue of $A$, then $A-\lambda I$ is invertible, so setting $x=-(A-\lambda I)^{-1}b$ does the trick. Indeed, $Ax+b-\lambda x=(A-\lambda I)x+b=-b+b=0$. (Note that symmetry of $A$ is not even needed here; any square matrix has at most $n$ eigenvalues, so a valid $\lambda$ always exists.)
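A quick numerical sanity check of this construction (a sketch using NumPy; the particular matrix, vector, and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random symmetric A and arbitrary b
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
b = rng.standard_normal(n)

# Pick lambda strictly larger than the spectral radius,
# so it cannot be an eigenvalue of A
lam = np.abs(np.linalg.eigvalsh(A)).max() + 1.0

# x = -(A - lam*I)^{-1} b, computed via a linear solve
# rather than forming the inverse explicitly
x = np.linalg.solve(A - lam * np.eye(n), -b)

# Verify Ax + b = lam * x up to floating-point error
residual = np.linalg.norm(A @ x + b - lam * x)
print(residual)
```

In the eigenbasis notation of the question, this choice of $x$ corresponds exactly to solving $\alpha_i(\lambda_i-\lambda)+\beta_i=0$ coordinate-wise, i.e. $\alpha_i=\beta_i/(\lambda-\lambda_i)$, which is possible precisely because $\lambda\neq\lambda_i$ for every $i$.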