Induced endomorphism of the torus by a matrix $A$ with no root-of-unity eigenvalue is mixing


Exercise 7.1.4 of the book "Foundations of Ergodic Theory" proposes the following problem:

Show that if no eigenvalue of $A \in SL(d,\mathbb{Z})$ is a root of unity, then the linear endomorphism $f_{A} :\mathbb{T}^{d}\rightarrow \mathbb{T}^{d}$ induced by $A$ is mixing with respect to the Haar measure.

My Approach

Consider the basis of Fourier functions $\phi_{k}([x])=e^{2\pi i\, k\cdot x}$, for each $k\in\mathbb{Z}^{d}$ and $[x]\in\mathbb{T}^{d}$.

By Proposition 7.1.7, it suffices to show that for all $k,l\in\mathbb{Z}^{d}$

$$\lim_{n\to\infty}\big[\langle \phi_{k}\circ f_{A}^{n},\,\phi_{l}\rangle-\langle\phi_{k},1\rangle\langle\phi_{l},1\rangle\big]=0,$$ where $\langle\cdot,\cdot\rangle$ denotes the inner product on $L^{2}(\mathbb{T}^{d},m)$.

It is easy to see that

$$\phi_{k}\circ f_{A}=\phi_{A^{T}k}$$
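For completeness, the one-line computation behind this identity (using the convention $k\cdot Ax=(A^{T}k)\cdot x$; note that $A^{T}$ has the same eigenvalues as $A$, so the hypothesis on roots of unity transfers):

$$\phi_{k}(f_{A}([x]))=e^{2\pi i\, k\cdot Ax}=e^{2\pi i\,(A^{T}k)\cdot x}=\phi_{A^{T}k}([x]).$$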

On the other hand, the sequence $A^{n}k$, $n\geq 0$, is injective for every $k\neq 0$: if $A^{m}k=A^{n}k$ with $m>n$, then, since $A$ is invertible, $(A^{m-n}-\mathrm{Id})k=0$, so $1$ would be an eigenvalue of $A^{m-n}$, i.e., some eigenvalue of $A$ would be a root of unity. Still, how can I finish the proof?
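As a sanity check (not part of the proof), the injectivity of the orbit $n\mapsto A^{n}k$ can be verified numerically for a concrete example. I use the cat-map matrix $A=\begin{pmatrix}2&1\\1&1\end{pmatrix}\in SL(2,\mathbb{Z})$, whose eigenvalues $(3\pm\sqrt{5})/2$ are not roots of unity; the matrix and the frequency $k$ below are my own choices for illustration:

```python
import numpy as np

# The cat-map matrix, an element of SL(2, Z); its eigenvalues
# (3 +/- sqrt(5)) / 2 are real and off the unit circle, so in
# particular no eigenvalue is a root of unity.
A = np.array([[2, 1], [1, 1]], dtype=np.int64)

k = np.array([1, 0], dtype=np.int64)  # an arbitrary nonzero frequency in Z^2

# Record the orbit n -> A^n k; entries grow like 2.618^n, so 25
# iterates stay comfortably inside int64.
orbit = []
v = k.copy()
for _ in range(25):
    orbit.append(tuple(v))
    v = A @ v

# Injectivity of the orbit: no integer vector repeats.
assert len(set(orbit)) == len(orbit)
print(f"first {len(orbit)} iterates of A^n k are pairwise distinct")
```

Since the $\phi_{m}$ are orthonormal in $L^{2}(\mathbb{T}^{d},m)$, the point of the injectivity is that the correlation $\langle\phi_{k}\circ f_{A}^{n},\phi_{l}\rangle$ can be nonzero for at most one value of $n$.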