Is there a simple proof that $\cos(1)$ is transcendental?

Does anyone know a simple proof that $\cos(1)$ ($1$ in radians) is transcendental?

Without using the Lindemann-Weierstrass theorem and the like. That high school students can understand.


We may mimic Hermite's proof of the transcendence of $e$.


Step 1. Let $i=\sqrt{-1}$ denote the imaginary unit. Let $f(x)$ be a polynomial and define

$$ F(x) := \sum_{k=0}^{\infty} (-i)^k f^{(k)}(x). $$

Since $f$ is a polynomial, this is a finite sum. Also, note that

$$ e^{ix} \int_{0}^{x} e^{-it} f(t) \, \mathrm{d}t = i (F(x) - e^{ix}F(0)). \tag{1} $$

This is easily proved either by multiplying both sides by $e^{-ix}$ and differentiating, or by applying integration by parts repeatedly.
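As a sanity check (not part of the proof), identity $\text{(1)}$ can be verified symbolically with SymPy for a hypothetical test polynomial, here $f(t) = t^2(t-1)$:

```python
import sympy as sp

t, x = sp.symbols('t x')
f = t**2 * (t - 1)  # an arbitrary (hypothetical) test polynomial
deg = sp.degree(f, t)

# F(x) = sum_k (-i)^k f^(k)(x); the sum is finite since f is a polynomial
F = sum((-sp.I)**k * sp.diff(f, t, k) for k in range(deg + 1))

lhs = sp.exp(sp.I * x) * sp.integrate(sp.exp(-sp.I * t) * f, (t, 0, x))
rhs = sp.I * (F.subs(t, x) - sp.exp(sp.I * x) * F.subs(t, 0))
print(sp.simplify(lhs - rhs))  # expect 0
```

Any other polynomial works just as well, since $\text{(1)}$ is an identity in $f$.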

Step 2. If $\cos(1)=(e^{i}+e^{-i})/2$ is algebraic, then so is $e^{i}$: since $e^{i} \cdot e^{-i} = 1$, the number $e^{i}$ is a root of $z^2 - 2\cos(1)\,z + 1 = 0$, a polynomial with algebraic coefficients. In light of this, it suffices to prove that $e^{i}$ is transcendental. To this end, we assume otherwise: then there exist integers $a_0, \cdots, a_n$, not all zero, such that

$$ \sum_{k=0}^{n} a_k e^{ik} = 0. \tag{2} $$

By canceling a power of $e^{i}$ if necessary, we may further assume $a_0 \neq 0$.

Writing the identity $\text{(1)}$ for $x = k$, multiplying by $a_k$, and summing over $k = 0, \cdots, n$, we get

$$ \sum_{k=0}^{n} a_k e^{ik} \int_{0}^{k} e^{-it} f(t) \, \mathrm{d}t = i \sum_{k=0}^{n} a_k F(k). \tag{3} $$

At this point, we still have the freedom to choose $f(t)$. Our goal is to choose $f(t)$ so that the left-hand side of $\text{(3)}$ is small while the right-hand side is a non-zero Gaussian integer (i.e., a complex number of the form $a+ib$ with $a, b \in \mathbb{Z}$). This leads to a contradiction, proving that no integers $a_0, \cdots, a_n$ satisfying $\text{(2)}$ exist.

Step 3. We follow Hermite's proof and pick $f(t)$ and $g(t)$ as

$$ f(t) := \frac{g(t)}{(p-1)!} \qquad \text{and} \qquad g(t) := t^{p-1}(t-1)^{p}\cdots(t-n)^{p}, $$

where $p$ is a prime number to be specified later. Noting that $|e^{-it}| = 1$ for real $t$, that $|g(t)| \leq n^{np+p-1}$ for all $t \in [0, n]$, and that each integration interval $[0, k]$ has length at most $n$, we get

$$ \left| \text{[LHS of (3)]} \right| \leq \frac{n^{(n+1)p}}{(p-1)!} \sum_{k=0}^{n} |a_k|. \tag{4} $$

Since this bound converges to $0$ as $p\to\infty$, we can find a prime $p$ such that

$$p > \max\{n, |a_0|, \cdots, |a_n|\} \tag{5} $$

and the bound $\text{(4)}$ is strictly smaller than $1$. We fix such $p$.
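The decay of the bound $\text{(4)}$ is just the fact that $(p-1)!$ eventually dominates any exponential $C^p$. A quick numerical illustration, with hypothetical values $n = 2$ and $\sum_k |a_k| = 10$:

```python
from math import factorial

# Bound (4): n^((n+1)p)/(p-1)! * sum|a_k|, for hypothetical n = 2, sum|a_k| = 10.
n, A = 2, 10
bounds = {p: n**((n + 1) * p) * A / factorial(p - 1) for p in (5, 11, 31)}
for p, b in bounds.items():
    print(p, b)
```

The bound need not decrease at first, but it drops below $1$ well before $p = 31$, and any prime at least that large also satisfies $\text{(5)}$ for these values.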

Step 4. Now examine the right-hand side of $\text{(3)}$. In view of Taylor's Theorem

$$ f(t) = \sum_{j=0}^{\infty} \frac{f^{(j)}(k)}{j!} (t-k)^j, $$

for each $k = 0, \cdots, n$ and $ j \geq 0$, we get

\begin{align*} f^{(j)}(k) &= j! \times \text{[coefficient of $(t-k)^j$ in $f(t)$]} \\ &= \frac{j!}{(p-1)!} \times \text{[coefficient of $(t-k)^j$ in $g(t)$]}. \end{align*}

From this, we can make several useful observations:

  1. If $j < p - 1$, then $f^{(j)}(0) = 0$, and $f^{(p-1)}(0) = (-1)^p(-2)^p \cdots(-n)^p$.

  2. If $k \in \{1,\cdots,n\}$ and $j < p$, then $f^{(j)}(k) = 0$.

  3. If $k \in \{0,\cdots,n\}$ and $j \geq p$, then $f^{(j)}(k)$ is an integer divisible by $p$ (since $j!/(p-1)!$ is an integer divisible by $p$).
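These observations can be checked symbolically for small hypothetical values, say $n = 2$ and $p = 3$, where observation 1 gives $f^{(p-1)}(0) = (-1)^3(-2)^3 = 8$:

```python
import sympy as sp

t = sp.symbols('t')
n, p = 2, 3  # small hypothetical values for a sanity check

# g(t) = t^(p-1) (t-1)^p ... (t-n)^p,  f = g/(p-1)!
g = t**(p - 1)
for j in range(1, n + 1):
    g *= (t - j)**p
f = g / sp.factorial(p - 1)
deg = sp.degree(g, t)

# Observation 1: f^(p-1)(0) = (-1)^p (-2)^p ... (-n)^p
lead = sp.diff(f, t, p - 1).subs(t, 0)
print(lead)  # (-1)^3 * (-2)^3 = 8

# Observations 2 and 3
for k in range(n + 1):
    for j in range(deg + 1):
        v = sp.diff(f, t, j).subs(t, k)
        if k >= 1 and j < p:
            assert v == 0          # (t-k)^p divides g
        if j >= p:
            assert v % p == 0      # integer divisible by p
```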

From this, it follows that

$$ \text{[RHS of (3)]} = i(-i)^{p-1} a_0 (-1)^p \cdots (-n)^p + \text{[Gaussian integer divisible by $p$]}. \tag{6} $$

Now invoke condition $\text{(5)}$: since $a_0 \neq 0$, $|a_0| < p$, and $1, \cdots, n < p$, the integer $a_0 (-1)^p \cdots (-n)^p$ is not divisible by $p$, and multiplication by the unit $i(-i)^{p-1}$ does not affect divisibility in the Gaussian integers. Hence $\text{(6)}$ is a non-zero Gaussian integer, so it has modulus at least $1$, contradicting the fact that the bound $\text{(4)}$ on the left-hand side of $\text{(3)}$ is strictly smaller than $1$.
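Note that the decomposition $\text{(6)}$ is a purely algebraic statement about $i \sum_k a_k F(k)$ and holds for any integers $a_k$ (only the contradiction uses the hypothetical relation $\text{(2)}$). A SymPy sanity check with hypothetical values $n = 2$, $p = 5$ and arbitrary coefficients:

```python
import sympy as sp

t = sp.symbols('t')
n, p = 2, 5
a = [3, -1, 4]  # arbitrary integers standing in for a_0, ..., a_n (hypothetical)

g = t**(p - 1)
for j in range(1, n + 1):
    g *= (t - j)**p
f = g / sp.factorial(p - 1)
deg = sp.degree(g, t)

def F(x):
    return sum((-sp.I)**k * sp.diff(f, t, k).subs(t, x) for k in range(deg + 1))

rhs = sp.expand(sp.I * sum(a[k] * F(k) for k in range(n + 1)))

# Leading term of (6); here p = 5 gives (-i)^(p-1) = 1
prod = 1
for j in range(1, n + 1):
    prod *= (-j)**p
lead = sp.I * (-sp.I)**(p - 1) * a[0] * prod

rem = sp.expand(rhs - lead)
print(sp.re(rem) % p, sp.im(rem) % p)  # both 0: the remainder is divisible by p
```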