Visualize signal drops


I have an input signal (blue) and a mean signal (green), which I compute from previous signals. They look like this:

[diverge.png — plot of the input (blue) and mean (green) signals; an arrow marks a short dip]

(I also have the standard deviation signal and can do others if needed.)

From these signals I wish to clearly visualize two things:

  1. Short dips like the one pointed to by the arrow above; and
  2. Longer reductions in amplitude, say a 20% decrease lasting one period or more.

I've tried applying various functions to the difference between the two signals (the Hilbert transform, among other things), but the oscillations make the curve hard to read, especially when zooming out. It's fair to assume the amplitude of the oscillations is roughly proportional to the mean of the signal, as you can probably tell just by looking at the curves.
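For reference, this is roughly what my Hilbert-transform attempt looks like (a sketch with synthetic stand-in data; the signal names and dip location are made up, and I also normalize the envelope by the mean to account for the amplitude scaling mentioned above):

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic stand-ins for the real signals (hypothetical values).
t = np.linspace(0, 10, 2000)
mean_signal = 2.0 + 0.1 * t                                    # slowly varying mean
input_signal = mean_signal * (1 + 0.3 * np.sin(2 * np.pi * 5 * t))
input_signal[1000:1020] *= 0.5                                 # a short artificial dip

# Difference between input and mean; the oscillations make this hard to read.
diff = input_signal - mean_signal

# The Hilbert transform gives the analytic signal; its magnitude is the envelope.
envelope = np.abs(hilbert(diff))

# Since the oscillation amplitude scales roughly with the mean,
# dividing by the mean gives a relative measure of the deviation.
relative_envelope = envelope / mean_signal
```

Zoomed out, `relative_envelope` is still dominated by the oscillation amplitude, which is what makes the dips hard to spot.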

Is it possible to filter out the noise while keeping the short dips?

PS. I'm not too versed in math and would much appreciate a layman's explanation alongside any more rigorous formulae.