How to compute the fundamental period of a linear combination of harmonically related complex exponentials?


Given real numbers $a_k$ for $k\in\{1,\ldots,N\}$, how do I prove that the fundamental period (the smallest $T>0$ for which $x(t)=x(t+T)$ for all $t$) of

$x_N(t)=\sum_{k=1}^{N}a_k e^{j\omega_0 k t}$

is $T=\frac{2\pi}{\omega_0}$ when $a_1\neq 0$?

This question is identical to this one, but the answer there does not help since I am not looking for "a period" but rather the fundamental period.

EDIT: This question arose from my reading of the book "Signals and Systems" (Prentice Hall, 1996) by Alan V. Oppenheim, Alan S. Willsky, and S. Hamid Nawab, which states the following on pages 186-187.

"A linear combination of harmonically related complex exponentials of the form $$x(t)=\sum_{k=-\infty}^{+\infty}a_k e^{jk\omega_0t}=\sum_{k=-\infty}^{+\infty}a_k e^{jk(2\pi/T)t}$$ is also periodic with period $T$."

It is easy to verify that $T$ is a period of $x(t)$, and even though it seems quite obvious that $T$ should also be the fundamental period, I am not quite able to put together a proof of that. So far, here's the closest I have gotten to a proof. To be clear, I will consider only the problem of proving that $x_N(t)$ is periodic with fundamental period $T$ when $a_1\neq 0$.
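As a numerical sanity check (not a proof), the claim can be tested for sample values of $\omega_0$ and $a_k$ (both arbitrary choices here). Since the set of periods of a continuous non-constant periodic function is $\tau_0\mathbb{Z}$, where $\tau_0$ is the fundamental period, and $T$ is a period, the fundamental period must be $T/m$ for some integer $m\geq 1$; so it suffices to rule out $T/m$ for $m\geq 2$:

```python
import numpy as np

# Sanity check (not a proof): for sample coefficients with a_1 != 0,
# T = 2*pi/omega_0 is a period of x_N, while T/m is not for m >= 2.
# Any fundamental period must be of the form T/m, since the set of
# periods of a continuous non-constant function is tau_0 * Z.
omega0 = 1.5           # arbitrary sample value
a = [1.0, 0.3, -2.0]   # a_1, a_2, a_3 with a_1 != 0 (arbitrary sample values)
N = len(a)
T = 2 * np.pi / omega0

def x(t):
    return sum(a[k - 1] * np.exp(1j * omega0 * k * t) for k in range(1, N + 1))

t = np.linspace(0.0, 10.0, 1000)
assert np.allclose(x(t), x(t + T))              # T is a period
for m in range(2, 6):
    assert not np.allclose(x(t), x(t + T / m))  # T/m is not a period
```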

Suppose that $x_N(t)$ is periodic with period $\tau>0$. My goal is to show that the smallest $\tau$ satisfying $x_N(t)=x_N(t+\tau)$ for all $t\in\mathbb{R}$ is equal to $T$. Since $x_N(t)=x_N(t+\tau)$ for all $t\in\mathbb{R}$ and $x_N$ is smooth (infinitely differentiable), we have $$\begin{gather*} x_N(t) =x_N(t+\tau)\\ x_N'(t) =x_N'(t+\tau)\\ \vdots\\ x_N^{(N-1)}(t)=x_N^{(N-1)}(t+\tau) \end{gather*}$$ and, in particular, at $t=0$, $$\begin{gather*} x_N(0) =x_N(\tau)\\ x_N'(0) =x_N'(\tau)\\ \vdots\\ x_N^{(N-1)}(0)=x_N^{(N-1)}(\tau) \end{gather*}$$

Since $$x_N^{(n)}(t)=\sum_{k=1}^{N}a_k(jk\omega_0)^ne^{j\omega_0kt}$$ it follows that $$\underbrace{\begin{bmatrix} 1 & 1 & \ldots & 1\\ j\omega_0 & 2j\omega_0 & \ldots & Nj\omega_0\\ (j\omega_0)^2 & (2j\omega_0)^2 & \ldots & (Nj\omega_0)^2\\ \vdots & \vdots & \ddots & \vdots\\ (j\omega_0)^{N-1} & (2j\omega_0)^{N-1} & \ldots & (Nj\omega_0)^{N-1} \end{bmatrix}}_{M}\begin{bmatrix} (1-e^{j\omega_0\tau})a_1\\ (1-e^{2j\omega_0\tau})a_2\\ \vdots\\ (1-e^{Nj\omega_0\tau})a_N \end{bmatrix}=0.$$ If $M$ is nonsingular, then each $(1-e^{jk\omega_0\tau})a_k$ must vanish; in particular $(1-e^{j\omega_0\tau})a_1=0$, and since $a_1\neq 0$ this forces $e^{j\omega_0\tau}=1$, i.e., $\tau$ is an integer multiple of $T$, which gives the desired result. But how do I show that $M$ is nonsingular? Also, perhaps there is a much simpler argument altogether.
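While I cannot prove the nonsingularity, a quick numerical check in Python (sample value of $\omega_0$ assumed) suggests that $M$ is indeed nonsingular whenever $\omega_0\neq 0$; note that column $k$ of $M$ consists of the powers of $jk\omega_0$, so $M$ has Vandermonde structure:

```python
import numpy as np

# Numerical check (not a proof): M appears nonsingular for omega_0 != 0.
# Column k of M is [1, (j*k*w0), (j*k*w0)^2, ..., (j*k*w0)^(N-1)], so M is
# the transpose of a Vandermonde matrix in the nodes j*k*w0, k = 1..N.
omega0 = 0.7  # arbitrary sample value
for N in range(1, 9):
    nodes = 1j * omega0 * np.arange(1, N + 1)
    M = np.vander(nodes, increasing=True).T   # M[m, k-1] = (j*k*w0)^m
    assert abs(np.linalg.det(M)) > 1e-9       # nonzero determinant
```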

EDIT 2: The matrix $M$ can be decomposed as follows. $$M=\text{diag}([1\ j\omega_0\ \ldots\ (j\omega_0)^{N-1}]^\top)\underbrace{\begin{bmatrix} 1 & 1 & \ldots & 1\\ 1 & 2 & \ldots & N\\ 1 & 2^2 & \ldots & N^2\\ \vdots & \vdots & \ddots & \vdots\\ 1 & 2^{N-1} & \ldots & N^{N-1} \end{bmatrix}}_{S}$$ where $\text{diag}(v)$ is an $N\times N$ diagonal matrix, i.e., a square matrix whose diagonal entries are the elements of $v\in\mathbb{C}^N$ and every other entry is equal to $0$. It is immediately obvious that $M$ is singular when $\omega_0=0$, in which case $x_N$ does not have a fundamental period. If $\omega_0\neq 0$, then $M$ is nonsingular if and only if $S$ is nonsingular.
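This factorization is straightforward to confirm numerically (sample values of $N$ and $\omega_0$ assumed):

```python
import numpy as np

# Check that M = diag([1, j*w0, ..., (j*w0)^(N-1)]) * S for sample values.
omega0, N = 0.7, 5  # arbitrary sample values
nodes = 1j * omega0 * np.arange(1, N + 1)
M = np.vander(nodes, increasing=True).T      # M[m, k-1] = (j*k*w0)^m
Dg = np.diag((1j * omega0) ** np.arange(N))  # the diagonal factor
S = np.array([[k**m for k in range(1, N + 1)] for m in range(N)])
assert np.allclose(M, Dg @ S)
```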

Through numerical computation, I found that $S$ can be decomposed as $S=LDU$ (at least for all $N$ up to 11), where $L=[\ell_{mn}]_{1\leq m,n\leq N}$ is a lower triangular matrix whose entry $\ell_{mn}$ is the Stirling number of the second kind $S(m,n)$, $D=[d_{mn}]_{1\leq m,n\leq N}$ is a diagonal matrix whose diagonal entries are given by the Pochhammer symbol $d_{nn}=\text{Pochhammer}(1,n-1)=(n-1)!$ for each $1\leq n\leq N$, and $U=[u_{mn}]_{1\leq m,n\leq N}$ is an upper triangular matrix satisfying $u_{mn}=\Gamma(m,n-m+1)/\Gamma(m,1)$ with $$\Gamma(m,k)=\prod_{i=1}^{m-1}(k+i-1).$$

For example, for $N=4$ we have

$$\underbrace{\begin{bmatrix} 1 & 1 & 1 & 1\\ 1 & 2 & 3 & 4\\ 1 & 4 & 9 & 16\\ 1 & 8 & 27 & 64 \end{bmatrix}}_{S}=\underbrace{\begin{bmatrix} 1 & 0 & 0 & 0\\ 1 & 1 & 0 & 0\\ 1 & 3 & 1 & 0\\ 1 & 7 & 6 & 1 \end{bmatrix}}_{L}\underbrace{\begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 2 & 0\\ 0 & 0 & 0 & 6 \end{bmatrix}}_{D}\underbrace{\begin{bmatrix} 1 & 1 & 1 & 1\\ 0 & 1 & 2 & 3\\ 0 & 0 & 1 & 3\\ 0 & 0 & 0 & 1 \end{bmatrix}}_{U}$$
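The conjectured factorization can be checked numerically by building $L$, $D$, and $U$ directly from the formulas above (Stirling numbers of the second kind computed via their standard recurrence; entries of $U$ as ratios of the rising-factorial products $\Gamma(m,k)$):

```python
import numpy as np
from math import prod, factorial

def stirling2(m, n):
    # Stirling number of the second kind S(m, n), standard recurrence.
    if m == n:
        return 1
    if n == 0 or n > m:
        return 0
    return n * stirling2(m - 1, n) + stirling2(m - 1, n - 1)

def gamma_rf(m, k):
    # Gamma(m, k) = prod_{i=1}^{m-1} (k + i - 1), a rising-factorial product.
    return prod(k + i - 1 for i in range(1, m))

for N in range(1, 9):
    S = np.array([[k**m for k in range(1, N + 1)] for m in range(N)])
    L = np.array([[stirling2(m, n) for n in range(1, N + 1)]
                  for m in range(1, N + 1)])
    D = np.diag([factorial(n - 1) for n in range(1, N + 1)])  # Pochhammer(1, n-1)
    U = np.array([[gamma_rf(m, n - m + 1) // gamma_rf(m, 1) if n >= m else 0
                   for n in range(1, N + 1)] for m in range(1, N + 1)])
    assert np.array_equal(S, L @ D @ U)  # conjecture holds for each N tested
```

Incidentally, the entries of $U$ simplify to binomial coefficients: $u_{mn}=\Gamma(m,n-m+1)/\Gamma(m,1)=\frac{(n-1)!}{(n-m)!\,(m-1)!}=\binom{n-1}{m-1}$ for $n\geq m$, which matches the example above.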

I have only tested this conjecture numerically, but if it holds for all $N$, then it would answer my question. However, it leaves open two questions: why does $S$ have this peculiar decomposition, and is there a simpler answer to my original question?