How can I get the minimum error probability for this decision problem?


I have the following decision problem with 4 hypotheses: $$H_j: Y_k=N_k+s_{jk},\ k=1,2,\ldots,n;\ j=0,1,2,3,$$ where the signals are $s_{jk}=E_0\sin\left(w_cT(k-1)+\left(j+\tfrac{1}{2}\right)\tfrac{\pi}{2}\right)$. In vector form: $$H_{j}: \underline{Y}=\underline{N}+\underline{s}_j,\ j=0,1,2,3.$$ How can I find the minimum error probability for equally likely signals in i.i.d. $N(0,\sigma^2)$ noise? (These signals are not orthogonal.) How can I obtain orthonormal signals for solving this problem? Thank you in advance.


Conditioned on the $j$-th signal being transmitted, the likelihood function of the observation $(Y_1, \ldots, Y_n)$ is proportional to $$\exp\left(-\frac{1}{2\sigma^2}\sum_{k=1}^n (s_{jk}-Y_k)^2\right).$$ Since the signals are equally likely to be transmitted, the minimum-error-probability decision rule is the same as the maximum-likelihood decision rule, viz.

Choose the hypothesis that has the largest likelihood

which in this instance means deciding that the signal $s_j$ that is closest in Euclidean distance to the observation $Y$ is the one most likely to have been transmitted. In other words, compute the four sums $$Z_j = \sum_{k=1}^n (s_{jk}-Y_k)^2, ~~j = 0, 1, 2, 3,$$ and decide that signal $s_j$ was transmitted if $Z_j < \min_{i\neq j} Z_i$.
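The minimum-distance rule above is easy to sketch in code. Here is a minimal NumPy illustration; the values of `n`, `E0`, `wc`, `T`, and `sigma` are arbitrary assumptions chosen for the example, not values from the question.

```python
import numpy as np

# Example parameters (assumed for illustration only)
n = 8
E0, wc, T, sigma = 1.0, 2 * np.pi * 0.1, 1.0, 0.5
k = np.arange(1, n + 1)

# s[j] holds the samples s_{jk} = E0*sin(wc*T*(k-1) + (j + 1/2)*pi/2)
s = np.array([E0 * np.sin(wc * T * (k - 1) + (j + 0.5) * np.pi / 2)
              for j in range(4)])

def detect(y):
    """Minimum-distance (ML) detector: return j minimizing
    Z_j = sum_k (s_{jk} - y_k)^2."""
    z = np.sum((s - y) ** 2, axis=1)
    return int(np.argmin(z))

# Simulate one transmission of signal j=2 in i.i.d. N(0, sigma^2) noise
rng = np.random.default_rng(0)
y = s[2] + sigma * rng.standard_normal(n)
j_hat = detect(y)
```

With no noise, `detect(s[j])` returns `j` exactly, since then $Z_j = 0$ while the other sums are strictly positive; with noise, errors occur when the noise pushes $Y$ closer to a competing signal.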