A question concerning an exercise from Tao and Vu


This is Exercise 4.1.5 from Tao and Vu's Additive Combinatorics.

$Z$ is a finite additive group equipped with a fixed symmetric non-degenerate bilinear form $\cdot : Z \times Z \to \mathbb{R}/\mathbb{Z}$.

Define $e : \mathbb{R}/\mathbb{Z} \to \mathbb{C}$ by $e(\theta) := e^{2 \pi i \theta}$.

For each $\xi \in Z$, define $e_\xi : Z \to \mathbb{C}$ by $e_\xi (x) := e(\xi \cdot x) = e^{2 \pi i \xi \cdot x}$.

Each $e_\xi$ is called a character.

Let $x$ be an element of $Z$ chosen uniformly at random. Show that the random variables $\{e_{\xi} (x): \xi \in Z\}$ are pairwise independent, and have variance $1$ and mean $0$ for $\xi \ne 0$ and mean $1$ for $\xi = 0$.
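As a sanity check on the mean and variance claims, here is a small Python sketch. It assumes the model case $Z = \mathbb{Z}/N\mathbb{Z}$ with the form $\xi \cdot x := \xi x / N \pmod 1$ (an assumption for illustration; the exercise is stated for a general $Z$):

```python
import cmath

# Numerical check of the mean/variance claims in the exercise, in the
# assumed model case Z = Z/NZ with bilinear form xi . x := xi*x/N (mod 1).
N = 7

def e(theta):
    """e(theta) = exp(2*pi*i*theta)."""
    return cmath.exp(2j * cmath.pi * theta)

def e_char(xi, x, n=N):
    """Character e_xi(x) = e(xi*x/n) on Z/nZ."""
    return e(xi * x / n)

for xi in range(N):
    # x uniform on Z/NZ: compute mean and variance of e_xi(x)
    mean = sum(e_char(xi, x) for x in range(N)) / N
    var = sum(abs(e_char(xi, x) - mean) ** 2 for x in range(N)) / N
    expected_mean = 1 if xi == 0 else 0  # mean 1 for xi = 0, else 0
    expected_var = 0 if xi == 0 else 1   # variance 1 for xi != 0
    assert abs(mean - expected_mean) < 1e-9
    assert abs(var - expected_var) < 1e-9
```

The mean and variance claims check out numerically; it is only the independence claim that is in question below.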

Now, I don't believe the independence statement is true. I have the following counterexample in mind:

Let $Z = \mathbb{Z}/5\mathbb{Z}$ with the bilinear form $x \cdot y := xy/5 \pmod 1$, and consider the characters $e_2$ and $e_3$. Now look at $\mathbb{P} (e_2(x) = e(1/5),\, e_3(x) = e(1/5))$. This probability is $0$: the event $e_2(x) = e(1/5)$ forces $2x \equiv 1 \pmod 5$, i.e. $x = 3$, while $e_3(x) = e(1/5)$ forces $3x \equiv 1 \pmod 5$, i.e. $x = 2$, and these are incompatible. But if $e_2(x)$ and $e_3(x)$ were independent, we would have $\mathbb{P} (e_2(x) = e(1/5))\, \mathbb{P} (e_3(x) = e(1/5)) = \frac{1}{5} \cdot \frac{1}{5} = \frac{1}{25}$.
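This counterexample can be checked by brute force. The sketch below uses $Z = \mathbb{Z}/5\mathbb{Z}$ with the form $xy/5 \pmod 1$ and the target value $e(1/5)$ (one concrete witness of non-independence):

```python
import cmath

# Brute-force check of the counterexample: on Z = Z/5Z with form x.y = xy/5,
# the joint event {e_2(x) = e(1/5), e_3(x) = e(1/5)} is empty, while each
# marginal event has probability 1/5.
N = 5

def e(theta):
    return cmath.exp(2j * cmath.pi * theta)

def close(a, b, tol=1e-9):
    return abs(a - b) < tol

target = e(1 / N)  # the value e(1/5)
hits_2 = [x for x in range(N) if close(e(2 * x / N), target)]  # e_2(x) = e(1/5)
hits_3 = [x for x in range(N) if close(e(3 * x / N), target)]  # e_3(x) = e(1/5)
joint  = [x for x in hits_2 if x in hits_3]

print(hits_2, hits_3, joint)  # [3] [2] [] -> joint probability 0, product 1/25
```

So $\mathbb{P}(\text{joint}) = 0 \neq \frac{1}{25}$, confirming that $e_2(x)$ and $e_3(x)$ are not independent in the standard sense.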

What is my mistake or misinterpretation? Or maybe this exercise is not correctly stated?

Accepted answer:

I think that what they want you to show is the orthogonality relation $\frac{1}{|Z|}\sum_{x \in Z} e_\xi(x)\, \overline{e_{\xi'}(x)} = 0$ for $\xi \neq \xi'$, which, as you say, is not the same as independence.
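The orthogonality relation is easy to verify numerically. Here is a sketch, again assuming the model case $Z = \mathbb{Z}/N\mathbb{Z}$ with form $\xi \cdot x = \xi x / N$:

```python
import cmath

# Orthogonality check on the assumed model Z = Z/NZ with xi . x = xi*x/N:
# (1/|Z|) * sum_x e_xi(x) * conj(e_eta(x)) equals 1 if xi == eta, else 0.
N = 6

def e_char(xi, x, n=N):
    return cmath.exp(2j * cmath.pi * xi * x / n)

def inner(xi, eta, n=N):
    """Normalized inner product of the characters e_xi and e_eta."""
    return sum(e_char(xi, x, n) * e_char(eta, x, n).conjugate()
               for x in range(n)) / n

for xi in range(N):
    for eta in range(N):
        expected = 1 if xi == eta else 0
        assert abs(inner(xi, eta) - expected) < 1e-9
```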

Another answer:

I think the authors made a mistake when they wrote this exercise; there is only one standard definition of independent random variables, and the $e_\xi$ do not satisfy it.

We can still do the exercise if we reinterpret the authors' intent: the exercise uses equation (1.9), which holds if the variables $X_i$ are pairwise independent. But (1.9) also holds if the $X_i$ merely have pairwise zero covariance, and the $e_\xi$ do have zero covariance.

This is similar to what Igor Rivin said, where you take $\sum_{x \in Z} e_\xi(x) \overline{e_{\xi'}(x)}$; because the $e_\xi$ with $\xi \neq 0$ have mean zero, this sum is essentially the covariance (it is the covariance times $|Z|$).

I'll write out my solution for completeness. It's actually a little more awkward than just using orthogonality, because of the pesky constant function $e_0$.

For any $f: Z \to \textbf{C}$ we want to show $$\textbf{E} _Z |f|^2 = \sum_{\xi \in Z} |\hat{f}(\xi)|^2,$$ where $\hat{f}(\xi) = \textbf{E}_Z f e_{\xi}$. Equation (4.4) says $f = \sum_{\xi \in Z} \hat{f}(\xi)e_\xi$. This equation becomes $f = c + X$ when we change notation as follows: observe that $\hat{f}(0)e_0$ is a constant $c \in \textbf{C}$, define $X_1, \ldots, X_n$ to be the remaining terms $\hat{f}(\xi)e_\xi$ with $\xi \neq 0$, and define $X = \sum_{i=1}^n X_i$.

Now we compute: \begin{align*} \textbf{E}_Z|f|^2 &= \textbf{E}_Z (c + X)\overline{(c + X)} \\ &= \textbf{E}_Z |X|^2 + \overline{c}\,\textbf{E}_Z X + c\, \textbf{E}_Z\overline{X} + |c|^2. \end{align*} Since each $X_i = \hat{f}(\xi)e_\xi$ for some $\xi \neq 0$, each $X_i$ has mean zero, and therefore the sum $X$ has mean zero. The equation above simplifies to \begin{equation}\textbf{E}_Z |f|^2 = \textbf{E}_Z |X|^2 + |c|^2. \tag*{$(*)$} \end{equation}

I claim \begin{align*} \textbf{E}_Z |X|^2 &= \text{Var}(X) \\ &= \sum_{i=1}^n \text{Var}(X_i) \\ &= \sum_{\xi \neq 0} |\hat{f}(\xi)|^2. \end{align*} The first equality holds because $X$ has mean zero, and the second by (1.9) (as I said earlier, (1.9) holds if the $X_i$ have zero covariance; they do, because the $e_\xi$ have zero covariance). For the third, note that $\text{Var}(e_\xi) = 1$ for nonzero $\xi$, so $\text{Var}(X_i) = \text{Var}(\hat{f}(\xi)e_\xi) = |\hat{f}(\xi)|^2$ for the corresponding nonzero $\xi$.

Substituting this chain of equalities into $(*)$, together with $c = \hat{f}(0)$, gives \begin{align*}\textbf{E}_Z |f|^2 &= \sum_{\xi \neq 0} |\hat{f}(\xi)|^2 + |\hat{f}(0)|^2 \\ &= \sum_{\xi \in Z} |\hat{f}(\xi)|^2. \tag*{$\square$} \end{align*}
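As a numerical sanity check of this Plancherel-type identity, here is a Python sketch. It assumes the model case $Z = \mathbb{Z}/N\mathbb{Z}$ with $\xi \cdot x = \xi x / N$ and uses the normalization $\hat{f}(\xi) = \textbf{E}_Z f e_\xi$ from above:

```python
import cmath
import random

# Check of E_Z |f|^2 = sum_xi |fhat(xi)|^2 on the assumed model Z = Z/NZ,
# with fhat(xi) = E_x f(x) e_xi(x) = (1/N) * sum_x f(x) exp(2*pi*i*xi*x/N).
N = 8
random.seed(0)

def e_char(xi, x, n=N):
    return cmath.exp(2j * cmath.pi * xi * x / n)

# a random complex-valued function f on Z/NZ
f = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(N)]

def fhat(xi):
    return sum(f[x] * e_char(xi, x) for x in range(N)) / N

lhs = sum(abs(v) ** 2 for v in f) / N          # E_Z |f|^2
rhs = sum(abs(fhat(xi)) ** 2 for xi in range(N))  # sum of |fhat(xi)|^2
assert abs(lhs - rhs) < 1e-9
```

The identity holds for any choice of $f$ here; the random choice just makes the check non-trivial.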

I'll tell the authors about the error sometime soon.