Derivative of Fourier Transform to phase for Gradient descent


My problem is the following: I have a complex time-domain signal $$f(t) = r(t)\,\exp(i\phi(t)) \in \mathbb{C}$$ with the corresponding discrete Fourier transform in frequency space $$s(\omega) = \mathrm{DFT}(f(t)), \quad \omega \in [\omega_0, \omega_N].$$ My goal is to change the phase $\phi(t)$ so as to maximize the signal $s(\omega)$ in a certain region $[\omega_a, \omega_b]$. The idea is to use gradient descent for this. I thought of a cost function like $$c(\phi) = \left| \frac{\sum_{i=a}^{b} s(\omega_i)}{\sum_{i=0}^{a} s(\omega_i) + \sum_{i=b}^{N} s(\omega_i)} \right|,$$ which is supposed to measure the ratio of the energy in the region I want to maximize to that of its surroundings. To run gradient descent I therefore need the derivative $dc/d\phi$, but I fail to compute it. Can anyone help me?
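To check my understanding of the setup numerically, here is a minimal numpy sketch of the cost and an analytic gradient, assuming uniform sampling, numpy's DFT convention $s_k = \sum_n f_n e^{-2\pi i kn/N}$, and a band given as an index range (the function name `cost_and_grad` and the array sizes are my own choices, not from any library). It uses $\partial s_k/\partial\phi_n = i\,f_n\,e^{-2\pi i kn/N}$ together with $\partial|A|/\partial\phi_n = \mathrm{Re}(\bar{A}\,\partial A/\partial\phi_n)/|A|$ and the quotient rule:

```python
import numpy as np

def cost_and_grad(phi, r, band):
    """Cost c = |sum_band s| / |sum_rest s| and its gradient w.r.t. phi."""
    N = len(phi)
    f = r * np.exp(1j * phi)          # f_n = r_n exp(i phi_n)
    s = np.fft.fft(f)                 # s_k = sum_n f_n exp(-2pi i k n / N)
    in_band = np.zeros(N, dtype=bool)
    in_band[band] = True
    A = s[in_band].sum()              # numerator sum over the band
    D = s[~in_band].sum()             # denominator sum over the rest
    c = abs(A) / abs(D)

    # ds_k/dphi_n = i f_n W[k, n] with W the DFT matrix
    k = np.arange(N)[:, None]
    n = np.arange(N)[None, :]
    W = np.exp(-2j * np.pi * k * n / N)
    dA = 1j * f * W[in_band].sum(axis=0)    # dA/dphi_n
    dD = 1j * f * W[~in_band].sum(axis=0)   # dD/dphi_n

    # d|A|/dphi_n = Re(conj(A) dA) / |A|, same for D; then quotient rule
    d_absA = np.real(np.conj(A) * dA) / abs(A)
    d_absD = np.real(np.conj(D) * dD) / abs(D)
    grad = (d_absA * abs(D) - abs(A) * d_absD) / abs(D) ** 2
    return c, grad
```

A plain gradient-ascent step would then be `phi += lr * grad`; I verified the gradient against finite differences for random inputs, but this is only a sketch of how I currently frame the problem, not a known-good method.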

[Figure: real part of the signal $f(t)$ before and after changing the phase]

[Figure: magnitude of the spectrum before and after changing the phase of $f(t)$]