Hypothesis testing on Probability of False Alarm using Neyman-Pearson


We have a binary hypothesis testing problem as follows:

  • $H_1$ (signal presence) : $y = s + n$

  • $H_0$ (signal absence) : $y = n$

Without loss of generality, to simplify the problem, let $s$ be a known constant signal (so $\|s\|^2$ is the energy of the received radio waveform), and let $n$ be AWGN with zero mean and variance $\sigma^2$. There are two kinds of errors:

  • False alarm, with probability $P_{FA} = P_{01}$, also known as a type I error: no signal is present, but a decision of signal presence is made.
  • Miss, with probability $P_M = P_{10}$, also known as a type II error: the signal is present, but a decision of no signal is made.

The probability of detection is therefore $P_D = P_{11} = 1 - P_{10}$. The optimization problem is: for a given false alarm probability $P_{FA} = \alpha$, maximize $P_D = P_{11}$. Since it is hard to define a cost function or to know the prior probabilities, the Neyman-Pearson criterion is adopted for decision making.

In other words, in this binary hypothesis test with action set $A = \{a_0, a_1\}$, given $P_{FA} = \int_X \delta(a_1|y)\, f(y|H_0)\, dy = \alpha$, $0 < \alpha < 1$,

we intend to find a decision rule $\delta(a_1|y)$ that maximizes $P_D = \int_X \delta(a_1|y)\, f(y|H_1)\, dy$,

where $X$ denotes the observation space and $y ∈X$.

The signal detection problem then reduces to a likelihood ratio test: if $\frac{f(y|H_1)}{f(y|H_0)} > K$, signal presence is claimed.

This leads to: if $y \cdot s > \frac{\|s\|^2}{2} + \sigma^2 \log K = \eta$, signal presence is declared, where $\eta$ is the decision threshold and can be determined from the false alarm probability.
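As a sketch of how $\eta$ follows from $P_{FA}$ (assuming the model above, with test statistic $T = y \cdot s$; under $H_0$, $T = n \cdot s \sim \mathcal{N}(0, \sigma^2 \|s\|^2)$, so $P_{FA} = Q(\eta / (\sigma \|s\|))$ and hence $\eta = \sigma \|s\|\, Q^{-1}(\alpha)$); the helper name `np_threshold` is my own:

```python
import math
from statistics import NormalDist

def np_threshold(s_energy, sigma, alpha):
    """Neyman-Pearson threshold eta for the statistic T = y.s at P_FA = alpha.

    Under H0, T = n.s ~ N(0, sigma^2 * ||s||^2), so
    P_FA = Q(eta / (sigma * ||s||))  =>  eta = sigma * ||s|| * Q^{-1}(alpha).
    """
    s_norm = math.sqrt(s_energy)                  # ||s||
    q_inv = NormalDist().inv_cdf(1.0 - alpha)     # Q^{-1}(alpha) = Phi^{-1}(1 - alpha)
    return sigma * s_norm * q_inv
```

Note that $\eta$ depends only on the $H_0$ statistics, which is exactly why the Neyman-Pearson setup needs no priors or costs.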

For the radar detection problem with hypotheses $H_1$ and $H_0$ as above, suppose $\|s\|^2 = 100$ and $n \sim \mathcal{N}(0,1)$. With $P_{FA} \le 10^{-2}$, how do we decide whether the signal is present or not?
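With these numbers ($\|s\| = 10$, $\sigma = 1$, $\alpha = 10^{-2}$), the threshold is $\eta = \sigma \|s\|\, Q^{-1}(10^{-2}) \approx 23.26$, and the resulting $P_D = Q((\eta - \|s\|^2)/(\sigma\|s\|))$ is essentially 1. A quick numerical sanity check under the assumptions above (variable names are mine; the Monte Carlo loop only verifies the false-alarm rate, it is not part of the detector):

```python
import random
from statistics import NormalDist

nd = NormalDist()
s_norm, sigma, alpha = 10.0, 1.0, 1e-2          # ||s|| = sqrt(100), n ~ N(0,1)
eta = sigma * s_norm * nd.inv_cdf(1 - alpha)    # decision threshold, ~23.26

# Theoretical detection probability: P_D = Q((eta - ||s||^2) / (sigma * ||s||))
p_d = 1 - nd.cdf((eta - s_norm**2) / (sigma * s_norm))

# Monte Carlo check of the false-alarm rate under H0 (y = n, so T = n * s)
random.seed(0)
trials = 200_000
false_alarms = sum(random.gauss(0, sigma) * s_norm > eta for _ in range(trials))
p_fa_hat = false_alarms / trials                # should be close to alpha = 0.01
```

So the rule is simply: declare $H_1$ whenever $y \cdot s > \eta \approx 23.26$; at this SNR the miss probability is negligible.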

Please help me on this. Thank you so much.