Suppose we have $M=xx^T$ where $x$ is a random vector in $\mathbb{R}^n$. Also, we know that $x=q+e$ where $q$ is distributed according to $D$, i.e., $q \sim D$ and $e$ is a bounded vector. Therefore, $M=xx^T=qq^T+qe^T+eq^T+ee^T$.
Given $M$, can we decompose it as $M = Q + N$, where $Q$ is a symmetric matrix that captures all the properties of $qq^T$ and $N$ is a noise matrix?
I suspect this problem has already been studied, but I don't know where to look or what material covers it.
If you want to approximate a matrix $M$ by a rank $k$ matrix, a standard thing to do is compute the SVD $M = U \Sigma V^T$ of $M$, set all but the top $k$ singular values equal to $0$ to obtain a new diagonal matrix $\tilde \Sigma$, then put the pieces back together: $Q = U \tilde \Sigma V^T$ is a rank $k$ approximation to $M$. By the Eckart–Young theorem, it is guaranteed to be the best rank $k$ approximation in both the spectral and Frobenius norms.
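As a quick sketch of the truncated-SVD recipe with NumPy (the symmetric test matrix here is just an illustration, not your specific $M$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = A + A.T  # symmetric test matrix

# Compute the SVD, zero out all but the top k singular values,
# and reassemble to get the best rank-k approximation.
k = 1
U, s, Vt = np.linalg.svd(A)
Q = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: the spectral-norm error equals the (k+1)-th singular value.
err = np.linalg.norm(A - Q, ord=2)
```

The check at the end confirms the optimality claim: no rank-$k$ matrix can do better than $\sigma_{k+1}$ in the spectral norm.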
Because you have $k = 1$, you can use power iteration to estimate the largest eigenvalue $\lambda$ of $M$ and a corresponding eigenvector $q$. Normalize $q$ so that it's a unit vector. Then $Q = \lambda qq^T$ is the best rank 1 approximation to $M$.
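A minimal power-iteration sketch for the $k = 1$ case (the function name and iteration count are my choices, not a library API):

```python
import numpy as np

def power_iteration(M, num_iters=100):
    """Estimate the dominant eigenvalue/eigenvector of a symmetric matrix M."""
    v = np.random.default_rng(1).standard_normal(M.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = M @ v
        v = w / np.linalg.norm(w)  # renormalize each step
    lam = v @ M @ v  # Rayleigh quotient gives the eigenvalue estimate
    return lam, v

# For M = x x^T the dominant eigenvector is x/||x|| with eigenvalue ||x||^2.
x = np.array([1.0, 2.0, 2.0])
M = np.outer(x, x)
lam, q = power_iteration(M)
Q = lam * np.outer(q, q)  # best rank-1 approximation; here Q recovers M exactly
```

Note that power iteration may return $-q$ instead of $q$, but $Q = \lambda qq^T$ is unaffected by the sign.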