A signal as a sum of sinusoids


I am doing some signal processing revision from my undergraduate studies, and I came across an interesting fact stated in the book I am reading.

Any signal can be written as the sum of a constant term and a sum of sinusoids. That is, $x(t) = A_0 + \sum_k A_k\cos(2\pi f_k t+\phi_k)$.

Using Euler's formula, this can be written as $$x(t) = X_0 + \sum_{k=1}^{N} \big\{ X_ke^{j2\pi f_k t} + X_k^*e^{-j2\pi f_k t} \big\} = X_0 + \sum_{k=1}^{N} X_ke^{j2\pi f_k t} + \sum_{k=1}^{N} X_k^*e^{-j2\pi f_k t},$$ where $X_0 = A_0$ and $X_k = \frac{A_k}{2}e^{j\phi_k}$ (the factor $\frac{1}{2}$ comes from $\cos\theta = \frac{1}{2}\big(e^{j\theta}+e^{-j\theta}\big)$).
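The equivalence of the real-valued and complex-exponential forms can be sanity-checked numerically. Below is a minimal sketch with made-up amplitudes, phases, and frequencies; it assumes $X_k = \frac{A_k}{2}e^{j\phi_k}$, with the factor $\frac{1}{2}$ coming from Euler's formula $\cos\theta = \frac{1}{2}(e^{j\theta}+e^{-j\theta})$.

```python
import numpy as np

# Hypothetical example: a constant plus two cosines with known parameters.
A = [1.5, 2.0, 0.7]            # A_0 (constant term), A_1, A_2
phi = [0.3, -1.1]              # phi_1, phi_2
f = [5.0, 12.0]                # f_1, f_2 in Hz
t = np.linspace(0, 1, 1000)

# Real-valued form: x(t) = A_0 + sum_k A_k cos(2*pi*f_k*t + phi_k)
x_real = A[0] + sum(A[k+1] * np.cos(2*np.pi*f[k]*t + phi[k]) for k in range(2))

# Complex form: X_0 + sum_k ( X_k e^{j 2pi f_k t} + X_k^* e^{-j 2pi f_k t} ),
# with X_k = (A_k / 2) e^{j phi_k} -- the 1/2 comes from Euler's formula.
X = [(A[k+1] / 2) * np.exp(1j * phi[k]) for k in range(2)]
x_complex = A[0] + sum(X[k] * np.exp(2j*np.pi*f[k]*t)
                       + np.conj(X[k]) * np.exp(-2j*np.pi*f[k]*t)
                       for k in range(2))

# The two forms agree, and the imaginary part is numerically zero.
assert np.allclose(x_real, x_complex.real)
assert np.max(np.abs(x_complex.imag)) < 1e-12
```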

What I want to prove is given

$$ a_k= \begin{cases} A_0, & k = 0\\ \frac{A_k}{2}e^{j\phi_k}, & k \neq 0 \end{cases} $$

the equation of signal composition can be written as:

$x(t) = \sum_{k=-N}^{N}a_ke^{j2\pi f_k t}$

Please let me elaborate by starting with what needs to be proven. The equation $x(t) = \sum_{k=-N}^{N}a_ke^{j2\pi f_k t}$ consists of $2N+1$ terms, and expanding it gives:

$$x(t) = a_{-N }e^{j2\pi f_{-N} t}+ a_{-N+1} e^{j2\pi f_{-N+1} t} + \dots + a_{-1} e^{j2\pi f_{-1} t} + a_0 e^{j2\pi f_0 t} + a_1 e^{j2\pi f_1 t} + \dots + a_{N-1} e^{j2\pi f_{N-1} t}+ a_{N} e^{j2\pi f_{N} t}$$

It is easy to see that the right half of the terms gives $\sum_{k=1}^{N}a_k e^{j2\pi f_k t} = \sum_{k=1}^{N} X_ke^{j2\pi f_k t}$. What is difficult for me is how to prove that the left half satisfies $\sum_{k=-N}^{-1}a_{k} e^{j2\pi f_k t} = \sum_{k=1}^{N} X_{k}^*e^{-j2\pi f_k t}$.

I have started by making the change of variable $k \to -k$, i.e., $\sum_{k=1}^{N}X_k^*e^{-j2\pi f_k t} = \sum_{k=-N}^{-1}X_{-k}^*e^{-j2\pi f_{-k} t}$. Thus, to make $\sum_{k=-N}^{-1}X_{-k}^*e^{-j2\pi f_{-k} t}$ equal to $\sum_{k=-N}^{-1}a_ke^{j2\pi f_k t}$, as far as I can see we subsequently have to prove

$$X_k = X_{-k}^{*} $$ and $$f_k=-f_{-k}$$

Am I right? Could you please help me understand this proof, both mathematically and intuitively?

Thank you.

EDIT: The whole discussion was inspired by Eq. (3.8) in this doc.
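The two conditions above (conjugate-symmetric coefficients and antisymmetric frequencies, $a_{-k} = a_k^*$ and $f_{-k} = -f_k$) can be checked numerically. Below is a sketch with made-up parameter values; it again uses $a_k = \frac{A_k}{2}e^{j\phi_k}$ for $k>0$, with the $\frac{1}{2}$ coming from Euler's formula.

```python
import numpy as np

# Hypothetical example values for A_k, phi_k, f_k (k = 1..N, with N = 2)
A0, A = 1.0, [2.0, 0.5]
phi = [0.4, -0.9]
f = [3.0, 7.0]
t = np.linspace(0, 1, 500)

# Build the two-sided coefficients: a_0 = A_0, a_k = (A_k/2) e^{j phi_k},
# extended by the conjectured symmetries a_{-k} = a_k^* and f_{-k} = -f_k.
a = {0: A0}
freq = {0: 0.0}
for k in (1, 2):
    a[k] = (A[k-1] / 2) * np.exp(1j * phi[k-1])
    a[-k] = np.conj(a[k])        # a_{-k} = a_k^*
    freq[k] = f[k-1]
    freq[-k] = -f[k-1]           # f_{-k} = -f_k

# Two-sided sum x(t) = sum_{k=-N}^{N} a_k e^{j 2pi f_k t}
x_twosided = sum(a[k] * np.exp(2j*np.pi*freq[k]*t) for k in (-2, -1, 0, 1, 2))

# Cosine form: A_0 + sum_k A_k cos(2*pi*f_k*t + phi_k)
x_cos = A0 + sum(A[k-1] * np.cos(2*np.pi*f[k-1]*t + phi[k-1]) for k in (1, 2))

# With the symmetries imposed, the two-sided sum is real and matches.
assert np.allclose(x_twosided.real, x_cos)
assert np.max(np.abs(x_twosided.imag)) < 1e-12
```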

Answer:
I'll introduce a new symbol: $\omega_k = 2\pi f_k$. Then we can write $$ x(t) = \sum_{k=-N}^N a_k e^{j \omega_k t}. $$

Since $x(t)$ is real, it equals its own conjugate, $x(t) = x^*(t)$, so $$ \sum_{k=-N}^N a_k e^{j \omega_k t} = \sum_{k=-N}^N a_k^* e^{-j \omega_k t}. $$

Because $\omega_{-k} = -\omega_k$, we can write this as $$ \sum_{k=-N}^N a_k e^{j \omega_k t} = \sum_{k=-N}^N a_k^* e^{j \omega_{-k} t}. $$

If we substitute $k' = -k$ on the right-hand side (the range $-N,\dots,N$ is symmetric, so it is unchanged), we get $$ \sum_{k=-N}^N a_k e^{j \omega_k t} = \sum_{k'=-N}^N a_{-k'}^* e^{j \omega_{k'} t}. $$

But $k$ and $k'$ are just dummy variables, i.e. summation indices with no meaning outside the sum, so we may just as well rename $k'$ back to $k$: $$ \sum_{k=-N}^N a_k e^{j \omega_k t} = \sum_{k=-N}^N a_{-k}^* e^{j \omega_{k} t}. $$

Now recall that the exponentials $\{ e^{j \omega_k t} \}$ with distinct frequencies are linearly independent, meaning that $$\sum_n a_n e^{j\omega_n t} = \sum_n b_n e^{j\omega_n t} \;\Rightarrow\; \forall n : a_n=b_n. $$ Here this implies $$a_k = a_{-k}^*. $$
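The conclusion $a_k = a_{-k}^*$ is exactly the Hermitian symmetry of the DFT coefficients of a real signal, which can be observed numerically. A sketch using numpy's FFT (its convention is assumed here: the coefficient at index $-k$ is stored at index $N-k$):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16
x = rng.standard_normal(N)      # an arbitrary real signal

# Coefficients a_k in x[n] = sum_k a_k e^{j 2pi k n / N}
a = np.fft.fft(x) / N

# Hermitian symmetry: a_{-k} = a_k^*  (index -k is N-k in FFT ordering)
for k in range(1, N):
    assert np.allclose(a[N - k], np.conj(a[k]))

# a_0 is real: it is the mean (constant term) of the signal.
assert abs(a[0].imag) < 1e-12
```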