Remove general drift from data? (Spectral Analysis)


I am working on separating a song from a moving source. The recordings all have a drift since the sound source itself is moving. This means that the information I am interested in (the higher frequencies) rides on top of a low-frequency drift. I don't have enough reputation to embed a picture, but it looks something like this: https://i.stack.imgur.com/31NRF.jpg

Here the red line is the drift (the movement of the sound source) I am referring to, and the faster oscillations are the song I am interested in.

Because all of my signals have a somewhat random drift, I would like to remove it and centre the signal around $y=0$, to make the spectral analysis more consistent.

The easiest approach that comes to mind is of course something like a high-pass filter, but that distorts the spectrum heavily. Other ideas are to run a moving-average (MA) filter over segments of the signal and subtract the result from the original, or to model the low-frequency component and subtract that.
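To make the moving-average idea concrete, here is a minimal sketch of MA detrending on a synthetic signal. All the numbers (sampling rate, drift and song frequencies, window length) are assumptions for illustration; the key choice is a window much longer than the song's period but much shorter than the drift's.

```python
import numpy as np

# Synthetic example: a slow "drift" plus a faster song-like component.
fs = 1000.0                              # sampling rate in Hz (assumption)
t = np.arange(0, 5, 1 / fs)
drift = 0.5 * np.sin(2 * np.pi * 0.3 * t)   # slow movement of the source
song = 0.1 * np.sin(2 * np.pi * 40.0 * t)   # the component of interest
x = drift + song

# Moving-average estimate of the drift: a 0.2 s window averages out the
# 40 Hz song (8 full periods per window) while passing the 0.3 Hz drift
# nearly unchanged.
win = int(0.2 * fs)
kernel = np.ones(win) / win
drift_est = np.convolve(x, kernel, mode="same")

# Subtract the estimated drift to centre the signal around y = 0.
detrended = x - drift_est
```

Note that `mode="same"` leaves edge artefacts within half a window of each end, so those samples should be discarded or handled separately before spectral analysis.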

All of my alternatives seem to distort the spectrum quite a bit, and it is important that it stays largely intact.

Do you have any ideas that could help me? Thank you very much.