Distribution of a random variable when noise is a multiple of a constant


Let $y = ax + \eta$ represent a random variable, where the noise $\eta$ takes values that are multiples of $\alpha$ or $\beta$, for known constants $\alpha$ and $\beta$. For example, when $\alpha = -10$ and $\beta = 15$, $y$ can take the following values: \begin{equation} y = ax - 10\\ y = ax - 20\\ y = ax \pm 30 \\ y = ax + 45 \\ \cdots \end{equation} What is the distribution of $y$ under such conditions? Assume that $a$ and $x$ are deterministic and $a$ is known. Can I use the least squares method to estimate $x$ from a set of values of $y$?


> Can I use least squares method to estimate $x$ from a set of values of $y$?

Here are some fairly weak sufficient conditions under which the answer is yes.

We have $$y_i = ax + \eta_i,\quad i \in\{ 1,...,n\},$$ where $a$ is known. The least-squares estimator for the unknown $x$ is easily found to be $\hat{x}=\overline{y}/a$ (assuming $a\ne 0$), where $\overline{y}=\frac{1}{n}\sum_{i=1}^ny_i.$

Indeed, setting the derivative of the sum of squared residuals to zero gives $$0= \frac{\partial}{\partial x} \sum_{i=1}^n{(y_i-ax)^2}=-2a\sum_{i=1}^n(y_i-ax)\implies \hat{x}=\frac{1}{a}\,\overline{y}=\frac{1}{a}\frac{1}{n}\sum_{i=1}^ny_i.$$
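As a minimal numerical sketch of this estimator: the values $a = 2$, $x = 7$, and the noise probabilities $0.6/0.4$ below are illustrative choices of mine, not from the question. The probabilities are chosen so that $\eta \in \{-10, 15\}$ (one multiple each of $\alpha = -10$ and $\beta = 15$) has mean zero.

```python
import numpy as np

rng = np.random.default_rng(0)
a, x_true = 2.0, 7.0   # assumed values for illustration
n = 10_000

# eta = -10 with prob 0.6, +15 with prob 0.4  =>  E[eta] = 0.6*(-10) + 0.4*15 = 0
eta = rng.choice([-10.0, 15.0], size=n, p=[0.6, 0.4])
y = a * x_true + eta

# Least-squares estimate: x_hat = mean(y) / a
x_hat = y.mean() / a
print(x_hat)  # close to 7.0 for large n
```

With zero-mean noise, $\hat{x}$ concentrates around the true $x$ as $n$ grows, which is exactly the content of the corollary below.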

The following theorem and corollary are easy to prove (proofs omitted):

Theorem. If the $\eta_i$ are pairwise independent with means $|E[\eta_i]|<\infty$ and variances $V[\eta_i]<\infty$, then $$\begin{align} E[\overline{y}]&=ax + \overline{\mu},\quad\text{where } \overline{\mu}=\frac{1}{n}\sum_{i=1}^n E[\eta_i]\\ \\ V[\overline{y}]&=\frac{1}{n}\,\overline{\sigma^2},\quad\text{where } \overline{\sigma^2}=\frac{1}{n}\sum_{i=1}^n V[\eta_i]. \end{align}$$

Corollary. If the $\eta_i$ are pairwise independent with $\frac{1}{n}\sum_{i=1}^n E[\eta_i]=0$ and uniformly bounded variances $V[\eta_i]\le\sigma^2<\infty$, then
$$\hat{x}=\frac{1}{a}\,\overline{y}=\frac{1}{a}\frac{1}{n}\sum_{i=1}^ny_i$$ is an unbiased, consistent estimator for $x$: unbiased because $E[\hat{x}]=\frac{1}{a}(ax+\overline{\mu})=x$, and consistent because $V[\hat{x}]=\frac{1}{a^2}V[\overline{y}]\le\frac{\sigma^2}{na^2}\to 0$ as $n\to\infty$.
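The corollary's two claims can be checked empirically. The following sketch reuses my illustrative setup ($a = 2$, $x = 7$, $\eta \in \{-10, 15\}$ with probabilities $0.6/0.4$, hence $E[\eta]=0$ and $V[\eta]=0.6\cdot 100 + 0.4\cdot 225 = 150$) and compares the sample variance of $\hat{x}$ over many trials against the theoretical $\sigma^2/(na^2)$.

```python
import numpy as np

rng = np.random.default_rng(1)
a, x_true = 2.0, 7.0            # assumed values for illustration
sigma2 = 0.6 * 100 + 0.4 * 225  # Var(eta) = 150, since E[eta] = 0

def x_hat_samples(n, trials=2000):
    """Return `trials` independent realizations of x_hat for sample size n."""
    eta = rng.choice([-10.0, 15.0], size=(trials, n), p=[0.6, 0.4])
    y = a * x_true + eta
    return y.mean(axis=1) / a

for n in (100, 10_000):
    est = x_hat_samples(n)
    # Empirical mean should be near x_true (unbiasedness);
    # empirical variance should match sigma2 / (n * a^2) and shrink with n (consistency).
    print(n, est.mean(), est.var(), sigma2 / (n * a * a))
```

The variance of $\hat{x}$ drops by a factor of 100 when $n$ increases from 100 to 10,000, as the bound $\sigma^2/(na^2)$ predicts.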