I have several equations $y_k = \boldsymbol{a}^T \boldsymbol{Q} \boldsymbol{x}_k, \forall k=1, \ldots, K$, where $\boldsymbol{a} \in \mathbb R^{3 \times 1}$, $\boldsymbol{Q} \in \mathcal{F} \triangleq \{ \boldsymbol{Q} \in \mathbb R^{3 \times 3} : {\rm det}(\boldsymbol{Q})=1 ,\ \boldsymbol{Q} \boldsymbol{Q}^T = \boldsymbol{I} \}$, and $\boldsymbol{x}_k \in \mathbb C^{3 \times 1}$. Both $\boldsymbol{a}$ and the $\boldsymbol{x}_k$ are known in advance. Now I have an estimator $\boldsymbol{\hat{Q}}$ computed from the measurements $\{ y_k\}_{k=1}^K$.
The estimator $\boldsymbol{\hat Q}$ is given by
$$\boldsymbol{\hat Q} = \arg \min_{\boldsymbol{Q} \in \mathcal{F}} \sum_{k=1}^K \left| y_k - \boldsymbol{a}^T \boldsymbol{Q} \boldsymbol{x}_k \right|^2$$
(each residual is a scalar, so the squared modulus is used in place of a vector norm).
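In case it helps to make the question concrete, here is a minimal numerical sketch of this estimator (the random test data, variable names, and the rotation-vector parameterization via `scipy` are all my own illustration choices, not part of the problem):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
a = rng.standard_normal(3)                                          # known vector a
K = 50
X = rng.standard_normal((K, 3)) + 1j * rng.standard_normal((K, 3))  # rows are the known x_k

Q_true = Rotation.random().as_matrix()                     # a ground-truth Q in SO(3)
y = X @ (Q_true.T @ a)                                     # y_k = a^T Q x_k (complex scalars)

def cost(rotvec):
    """Least-squares objective; Q is parameterized by a rotation vector,
    which enforces det(Q) = 1 and Q Q^T = I exactly."""
    Q = Rotation.from_rotvec(rotvec).as_matrix()
    return np.sum(np.abs(y - X @ (Q.T @ a)) ** 2)

res = minimize(cost, np.zeros(3), method="Nelder-Mead")    # local search over SO(3)
Q_hat = Rotation.from_rotvec(res.x).as_matrix()            # the estimate \hat{Q}
```

By construction `Q_hat` always satisfies the constraints defining $\mathcal F$; note the local search is not guaranteed to find the global minimizer, which is part of why an analytical handle on the problem would be useful.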
I want to analyze the effect on the estimator $\boldsymbol{\hat {Q}}$ of a disturbance $\boldsymbol{n} \in \mathbb R^{3 \times 1}$ added to $\boldsymbol{x}_k$. When such a disturbance is present, the equation becomes $y_k = \boldsymbol{a}^T (\boldsymbol{{Q}} + \boldsymbol{\Delta}) (\boldsymbol{x}_k + \boldsymbol{n})$, where $\boldsymbol{\Delta} \triangleq \boldsymbol{\hat Q} - \boldsymbol{Q}$ is the resulting estimation error.
For example, when $\| \boldsymbol{n} \|_2$ is small, the effect on $\boldsymbol{\hat Q}$ can presumably be neglected. But I want to know analytically how $\boldsymbol{\Delta}$ behaves as $\boldsymbol{n}$ varies.
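Before any analysis, one cheap way to get a feel for this is to sweep $\|\boldsymbol n\|_2$ numerically and refit; since $\boldsymbol Q$ is fixed, the change in $\boldsymbol{\hat Q}$ relative to the unperturbed fit equals the change in $\boldsymbol\Delta$. A self-contained sketch (the setup choices are again mine):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(1)
a = rng.standard_normal(3)                                          # known vector a
K = 50
X = rng.standard_normal((K, 3)) + 1j * rng.standard_normal((K, 3))  # rows are the known x_k
Q_true = Rotation.random().as_matrix()                     # ground-truth Q in SO(3)
y = X @ (Q_true.T @ a)                                     # clean measurements y_k = a^T Q x_k

def fit(X_used):
    """Refit \\hat{Q} by least squares over SO(3) with (possibly perturbed) inputs."""
    def cost(rotvec):
        Q = Rotation.from_rotvec(rotvec).as_matrix()
        return np.sum(np.abs(y - X_used @ (Q.T @ a)) ** 2)
    res = minimize(cost, np.zeros(3), method="Nelder-Mead")
    return Rotation.from_rotvec(res.x).as_matrix()

Q0 = fit(X)                                                # unperturbed estimate
n_dir = rng.standard_normal(3)
n_dir /= np.linalg.norm(n_dir)                             # fixed unit direction for n
scales = [0.0, 1e-3, 1e-2, 1e-1]                           # swept values of ||n||_2
# Frobenius norm of the change in \hat{Q} (= change in Delta) per disturbance size
deltas = [np.linalg.norm(fit(X + s * n_dir) - Q0) for s in scales]
```

Plotting `deltas` against `scales` on a log-log scale gives a quick empirical read on the order of growth of $\|\boldsymbol\Delta\|$ in $\|\boldsymbol n\|_2$, which can then be checked against whatever perturbation analysis is appropriate.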
Does anyone know the name of this problem, and could you recommend some guidelines or textbooks on it? Any comments would be appreciated. Thanks!