CUSUM Algorithm


One commonly used changepoint detection test is the CUSUM algorithm. It performs optimally when the pre-change mean $\mu_1$ and standard deviation $\sigma_1$ of the process are known. The statistic $S_j$ is then defined as:

$S_0 = 0$,

$S_j = \max \{ 0, S_{j-1} + x_j - \mu_1 - k\sigma_1 \}$,

where $x_j$ is the $j$th observation of the process and $k$ is a control parameter tuned to the magnitude of the change one is trying to detect. If $S_j$ exceeds some threshold $h\sigma_1$ (where $h$ is another control parameter), a change is declared.
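For concreteness, here is a minimal sketch of this one-sided CUSUM detector, assuming the standard parameterization in which the allowance $k$ and threshold $h$ are expressed in units of $\sigma_1$ (the function name and defaults are illustrative, not from any particular library):

```python
def cusum(xs, mu1, sigma1, k=0.5, h=5.0):
    """One-sided CUSUM for detecting an upward shift in the mean.

    mu1, sigma1 : pre-change mean and standard deviation (assumed known).
    k : allowance, in units of sigma1.
    h : decision threshold, in units of sigma1.
    Returns the index of the first alarm, or None if no change is declared.
    """
    s = 0.0
    for j, x in enumerate(xs):
        # Accumulate deviations above mu1 + k*sigma1; reset at zero.
        s = max(0.0, s + x - mu1 - k * sigma1)
        if s > h * sigma1:
            return j
    return None


# Example: 20 in-control observations, then a mean shift of +2.
data = [0.0] * 20 + [2.0] * 10
print(cusum(data, mu1=0.0, sigma1=1.0))  # alarms a few steps after the shift
```

Note that $S_j$ is a cumulative quantity: although each update only needs the previous value $S_{j-1}$, the statistic implicitly summarizes the entire history of the process relative to the fixed pre-change parameters.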

I have been asked the following question, and I would like your ideas.

In a streaming environment, where we only store the most recent few observations and the underlying process may drift over time, suggest reasons why the CUSUM algorithm as described above may not be suitable.