I am learning about the Discrete Fourier Transform from this pdf, from page $167$ (pdf page - the book page is $152$), which approaches the topic from a linear-algebraic perspective.
Preliminaries:
First they define $\Bbb{Z}/n$ as the quotient set of $\Bbb{Z}$ under the following equivalence relation $\sim$:
$$j\sim j'\iff\frac{j-j'}{n}\in\Bbb{Z}$$
And then they define $\ell^2(\Bbb{Z}/n)$ as the set of functions from $\Bbb{Z}/n\to\Bbb C$, and $\ell^2_{\Bbb R}$ similarly but with the reals as the codomain. Side question: is this $\ell^2$ notation standard? I thought $\ell$ denoted sequence spaces...
They endow $\ell^2(\Bbb Z/n)$ with the inner product $\frac{1}{n}\sum_{k\in\Bbb{Z}/n}f(k)\overline{g(k)}=\frac{1}{n}\sum_{k=0}^{n-1}f(k)\overline{g(k)}$. They define the translation operator (I know it as the "shift" operator) $T$ to be such that $(Tf)(k)=f(k+1)$. They then state (paraphrasing):
$T$ is unitary on $\ell^2(\Bbb{Z}/n)$ and therefore orthogonal on $\ell^2_{\Bbb{R}}(\Bbb Z/n)$ $(1)$. This implies that $\ell^2(\Bbb{Z}/n)$ has an orthogonal basis of eigenvectors of $T$. $(2)$
I don't want to place two different questions in the same post, but they all come from the same two pages of the text. I fail to see why $(1)$ and $(2)$ hold. They are clear in the case of matrices, but this is more abstract linear algebra - I don't think I can simply argue that a complex unitary matrix is evidently orthogonal when its entries are real, on the grounds that the complex conjugate of a real number is itself.
Main question:
Defining $\omega=\exp(2\pi i/n)$, they assert that each $\omega^j$, $j\in\Bbb{Z}/n$, is an eigenvalue of $T$, with $f(k)=\omega^{jk}$ as a corresponding eigenvector.
I believe this is justified only because $j,k$ are integers - a brief complex analysis of $(\exp(2\pi i/n)^j)^k$ showed me that, but please do correct me if I'm wrong!
The modular-exponential issue I have now:
$$e_j(k)=\omega^{jk}\\\langle e_j,e_i\rangle=\frac{1}{n}\sum_{k=0}^{n-1}\omega^{k(j-i)}=\begin{cases}1&j=i\\0&j\neq i\end{cases}\tag{1}$$This is seen by the observation: $$\omega^m\sum_{k=0}^{n-1}\omega^{mk}=\sum_{k=0}^{n-1}\omega^{m(k+1)}=\sum_{k=0}^{n-1}\omega^{mk}\tag{2}$$ Since $k+1$ runs once over $\Bbb Z/n$ as $k$ does.
Statement $(2)$ makes sense to me; however, its application in understanding statement $(1)$ does not. I do not know enough about exponentiation in a modular set to understand why the sum equals $0$ when $j\neq i$ but $1$ when $j=i$. The multiplication by $1/n$ in particular is throwing me off, since $1/n\notin\Bbb Z/n$.
You can think of $\ell^2(\mathbb{Z}/n)$ as $\mathbb{C}^n$ by associating $f \in \ell^2(\mathbb{Z}/n)$ with $(f(0), f(1), \ldots, f(n-1))$ for instance. The inner product ends up being the same as the usual inner product on $\mathbb{C}^n$ (scaled by $1/n$). This ignores the modular structure temporarily, but can be remedied easily when needed (see how $T$ gets represented below).
For $n=4$, the matrix representation of $T$ is $\begin{bmatrix}&1\\&&1\\&&&1\\1\end{bmatrix}$, since $(Tf)(k) = f(k+1)$ sends $(f(0),f(1),f(2),f(3))$ to $(f(1),f(2),f(3),f(0))$. If you understand why the statements hold for matrices, I think you are fine; I don't think there is anything deeper going on here.
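To make the matrix picture concrete, here is a quick numerical sanity check (a Python sketch; the choice $n=4$ matches the example above) that this permutation matrix is unitary and that $e_j(k)=\omega^{jk}$ is indeed an eigenvector of $T$ with eigenvalue $\omega^j$:

```python
import cmath

n = 4
# Matrix of T under the identification f <-> (f(0), ..., f(n-1)):
# (Tf)(k) = f(k+1 mod n), so row r has a 1 in column (r+1) mod n.
T = [[1 if c == (r + 1) % n else 0 for c in range(n)] for r in range(n)]

# T is a real permutation matrix, so unitarity reduces to T^t T = I.
TtT = [[sum(T[k][r] * T[k][c] for k in range(n)) for c in range(n)] for r in range(n)]
I = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
print(TtT == I)  # True

# e_j(k) = w^{jk} satisfies T e_j = w^j e_j.
w = cmath.exp(2j * cmath.pi / n)
j = 1
e = [w ** (j * k) for k in range(n)]
Te = [sum(T[r][c] * e[c] for c in range(n)) for r in range(n)]
print(all(abs(Te[r] - w ** j * e[r]) < 1e-12 for r in range(n)))  # True
```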
There is nothing really fancy going on in the last two equations. It may help to just restrict $i,j$ to $\{0,1,\ldots,n-1\}$ and ignore the modular aspect. When $j = i$, we have $\frac{1}{n} \sum_{k=0}^{n-1} \omega^0 = 1$. When $j \ne i$, applying (2) with $m = j-i$ gives $\omega^{j-i} \sum_{k=0}^{n-1} \omega^{k(j-i)} = \sum_{k=0}^{n-1} \omega^{k(j-i)}$; since $\omega^{j-i} \ne 1$ (because $0 < |j-i| < n$), the sum must be zero.
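If it helps, the orthonormality relation $(1)$ is also easy to check numerically (a Python sketch; the choice $n=8$ and the indices are arbitrary):

```python
import cmath

n = 8
w = cmath.exp(2j * cmath.pi / n)

def inner(j, i):
    # <e_j, e_i> = (1/n) sum_k w^{jk} * conj(w^{ik}) = (1/n) sum_k w^{k(j-i)}
    return sum(w ** (k * (j - i)) for k in range(n)) / n

print(abs(inner(3, 3) - 1) < 1e-12)  # j = i  -> inner product is 1
print(abs(inner(3, 5)) < 1e-12)      # j != i -> inner product is 0
```

Note that the $1/n$ here is ordinary division of complex numbers; it never needs to live in $\Bbb Z/n$.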
The modularity in the exponentiation is baked into the definition of $\omega$ already, since $\omega^j = \omega^{j'}$ whenever $j \sim j'$.
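For instance (a small sketch illustrating that well-definedness; $n=6$ and the exponents are arbitrary):

```python
import cmath

n = 6
w = cmath.exp(2j * cmath.pi / n)
j, jp = 2, 2 + 3 * n  # j ~ j' since (j - j')/n is an integer
print(abs(w ** j - w ** jp) < 1e-9)  # True: w^j depends only on j mod n
```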