I have been watching an introductory lecture on additive combinatorics. Starting at around minute 16:00, the speaker makes a claim that I do not understand. I include all the details here.
Given a function $f : \mathbb{Z}/n\mathbb{Z} \rightarrow \mathbb{C}$, its Fourier transform at a point $r \in \mathbb{Z}/n\mathbb{Z}$ is defined by $$\hat{f}(r) = \mathbb{E}_{x \in \mathbb{Z}/n\mathbb{Z}} f(x) e^{2\pi i x/n} .$$
The claim is that if $A \subseteq \mathbb{Z}/n\mathbb{Z}$ is chosen at random, by including each element of $\mathbb{Z}/n\mathbb{Z}$ in $A$ with probability $0.5$, independently of all other choices, then $\hat{1}_A(1) \approx 1/\sqrt{n}$. That is, the Fourier transform of the indicator function of $A$ at $1$ should be roughly $1/\sqrt{n}$. The way I understand it, this means the expectation of $\hat{1}_A(1)$ is roughly $1/\sqrt{n}$.
I don't understand why it shouldn’t be $0$. Here is my argument:
For every $a \in \mathbb{Z}/n\mathbb{Z}$ define the random variable $X_a = e^{2 \pi i a/n}$ if $a \in A$, and $X_a = 0$ otherwise. Then $\mathbb{E}[X_a] = \frac{1}{2}e^{2 \pi i a/n}$. Then,
$$\hat{1}_A(1) = \frac{1}{n} \sum _{a \in \mathbb{Z}/n\mathbb{Z}} X_a$$ and
$$\mathbb{E}[\hat{1}_A(1)] = \frac{1}{n} \sum_{a \in \mathbb{Z}/n\mathbb{Z}} \mathbb{E}[X_a] = \frac{1}{2n} \sum_{a \in \mathbb{Z}/n\mathbb{Z}} e^{2 \pi i a/n} = 0,$$
since the last sum is just the sum of all $n$-th roots of unity.
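As a sanity check, this expectation can also be estimated numerically. Here is a quick sketch using numpy (the modulus $n$ and the number of trials are arbitrary choices of mine):

```python
import numpy as np

# Estimate E[ \hat{1}_A(1) ] over many independent random sets A ⊆ Z/nZ
rng = np.random.default_rng(0)
n, trials = 128, 20000
phases = np.exp(2j * np.pi * np.arange(n) / n)  # e^{2 pi i a / n} for a = 0..n-1

# Each row is the 0/1 indicator vector of an independent random set A
A = rng.integers(0, 2, size=(trials, n))
ft = (A * phases).sum(axis=1) / n  # \hat{1}_A(1) for each sampled A

print(abs(ft.mean()))  # close to 0, matching the computation above
```

The empirical mean indeed comes out very close to $0$, so the computation above seems right.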
Where am I wrong?
Your definition of the Fourier transform is suspect: the frequency $r$ is missing from the exponent. It should be something like $$\hat{f}(r) = E_{x \in \mathbb{Z}/n\mathbb{Z}}f(x)\exp(2\pi i r x / n) = \frac{1}{n}\sum_{x = 0}^{n - 1}f(x)\exp(2\pi i r x / n).$$

I'll write $$\hat{1_A}(j) = \frac{1}{n}\sum_{k = 0}^{n - 1}1_A(k)w_n^{jk}, \qquad \text{where } w_n = \exp(2\pi i / n).$$ Then $$E(\hat{1_A}(j)) = \frac{1}{n}\sum_{k = 0}^{n - 1}\frac{1}{2}w_n^{jk} = \frac{1}{2}I(j = 0),$$ since the geometric sum vanishes unless $j = 0$. And, since the $1_A(\ell)$ are independent (so only diagonal terms survive in the covariance), \begin{align} \text{Cov}(\hat{1_A}(j), \hat{1_A}(k)) &= \frac{1}{n^2}\sum_{\ell = 0}^{n - 1}w_n^{(j - k)\ell}\,\text{Var}(1_A(\ell)) \\ &= \frac{1}{4n^2}\sum_{\ell = 0}^{n - 1}w_n^{(j - k)\ell} \\ &= \frac{1}{4n}I(j = k). \end{align}

Hence $E(\hat{1_A}) = \frac{1}{2}e_0$ and the covariance matrix is $\frac{1}{4n}I$. So $\hat{1_A}(1)$ has mean $0$ and variance $\frac{1}{4n}$: your computation of the expectation is correct, but the claim $\hat{1}_A(1) \approx 1/\sqrt{n}$ is about the typical *magnitude* $|\hat{1}_A(1)|$, which is of order $1/\sqrt{n}$ by the variance computation, not about the expectation.
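For what it's worth, the mean and variance above match a quick simulation (a numpy sketch; $n$ and the trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 128, 20000
w = np.exp(2j * np.pi * np.arange(n) / n)  # w_n^k for k = 0..n-1

A = rng.integers(0, 2, size=(trials, n))  # rows: indicators of random sets A
ft1 = (A * w).sum(axis=1) / n             # \hat{1_A}(1) for each sample

print(abs(ft1.mean()))     # close to 0: the mean of \hat{1_A}(1)
print(np.var(ft1))         # close to 1/(4n): the variance computed above
print(np.abs(ft1).mean())  # typical magnitude, on the order of 1/sqrt(n)
```

(For a complex array, `np.var` computes $E|Z - EZ|^2$, which is the right notion of variance here.)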