Proof of the identity $\sin(x+y) =\sin(x)\cos(y) + \cos(x)\sin(y)$ for all $x$ and $y$


I don't know if I'm asking for too much, but the proofs I've seen of the statement

$$\sin(x+y) =\sin(x)\cos(y) + \cos(x)\sin(y)$$

consist of drawing a couple of triangles, one on top of the other, and then figuring out some angles and lengths until they arrive at the identity.

And I agree with the proof; it's just that, even by flipping the triangle around, it only proves the identity for the case $x+y<\pi/2$, and if it does prove it for all values of $x$ and $y$, I wouldn't understand why.

As for constructing a proof using Euler's identity or the derivatives of $\sin$ and $\cos$, I would ask the writer to first prove those already-accepted formulas without using the addition identity.

So that is my humble question: how could one prove that the identity $\sin(x+y) = \sin(x)\cos(y) + \cos(x)\sin(y)$ holds for all values of $x$ and $y$?

Any thoughts/ideas would be really appreciated.

Best answer:

My favorite proof is based on transformation matrices. If you want to rotate a point $(x,y)$ counter-clockwise around the origin by $t$ radians, you can use matrix multiplication:

$$\begin{bmatrix}\cos t & -\sin t \\ \sin t & \cos t \end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}$$

The product will be the coordinates of the newly rotated point.

So, suppose you want to rotate $(x,y)$ by $a + b$ radians. You could either do this in one go, or you could first rotate by $a$ radians and then by $b$. Either way, of course, you should end up with the same point. In other words,

$$\begin{bmatrix}\cos (a+b) & -\sin (a+b) \\ \sin (a+b) & \cos (a+b) \end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}$$ and $$\begin{bmatrix}\cos b & -\sin b \\ \sin b & \cos b \end{bmatrix}\begin{bmatrix}\cos a & -\sin a \\ \sin a & \cos a \end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix} = \begin{bmatrix}\cos a \cos b - \sin a \sin b & -\sin a \cos b - \cos a \sin b\\ \cos a \sin b + \sin a \cos b & -\sin a \sin b + \cos a \cos b\end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}$$

should give us the same product. Since this holds for every point $(x,y)$, the two matrices must be equal entry by entry, so $\sin(a+b) = \cos a \sin b + \sin a \cos b$ (and also, $\cos(a+b) = \cos a \cos b - \sin a \sin b$!).
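As a quick numerical sanity check of the argument above (a sketch using NumPy, not part of the original answer), one can verify that rotating by $a+b$ in one go agrees with composing the two rotations, and read the addition formula off the bottom-left entry:

```python
import numpy as np

def rotation(t):
    """2x2 matrix rotating the plane counter-clockwise by t radians."""
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

a, b = 0.7, 2.3  # arbitrary angles, chosen only for illustration

# Rotating by a+b at once vs. rotating by a, then by b:
one_go   = rotation(a + b)
composed = rotation(b) @ rotation(a)
assert np.allclose(one_go, composed)

# The bottom-left entries give the sine addition formula:
assert np.isclose(np.sin(a + b),
                  np.sin(a) * np.cos(b) + np.cos(a) * np.sin(b))
```

Note that the angles need not be acute; the check passes for any real $a$ and $b$.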


As a reference, consider page $125$ of these notes. Defining the complex exponential function through the everywhere-convergent series
$$e^z\stackrel{\text{def}}{=}\sum_{n\geq 0}\frac{z^n}{n!}, $$
it is simple to check that $e^{z}\cdot e^{w} = e^{z+w}$:
$$ e^{z+w}=\sum_{n\geq 0}\frac{1}{n!}\sum_{k=0}^{n}\binom{n}{k}z^k w^{n-k}=\sum_{a,b\geq 0}\frac{z^a}{a!}\cdot\frac{w^b}{b!}=e^{z}\cdot e^{w}. $$
In particular, for any $\theta\in\mathbb{R}$ the squared modulus of $e^{i\theta}$, given by $e^{i\theta}\cdot\overline{e^{i\theta}} = e^{i\theta}\cdot e^{-i\theta}$, equals $1$. Since $\frac{d}{dz}e^z=e^z$ is a trivial consequence of the series definition, the derivative of the map $\theta\mapsto e^{i\theta}$ has unit modulus, so this map is an arc-length parametrization of the unit circle. The functions defined by
$$\sin\theta = \text{Im }e^{i\theta},\qquad \cos\theta=\text{Re }e^{i\theta}$$
are therefore the elementary trigonometric functions we already know, and the statement $\left\|e^{i\theta}\right\|^2=1$ is equivalent to the Pythagorean Theorem. The interesting consequence is that:

$$\begin{eqnarray*}\sin(\theta+\varphi) = \text{Im}\left(e^{i\theta}\cdot e^{i\varphi}\right)&=&\text{Im}\left[\left(\cos\theta+i\sin\theta\right)\cdot\left(\cos\varphi+i\sin\varphi\right)\right]\\&=&\sin\theta\cos\varphi+\sin\varphi\cos\theta.\end{eqnarray*} $$
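This computation can be spot-checked numerically with Python's `cmath` module (a small sketch, not part of the original answer), comparing $\text{Im}\,e^{i(\theta+\varphi)}$ against the right-hand side at random real angles:

```python
import cmath
import math
import random

# Spot-check Im(e^{i(t+p)}) = sin(t)cos(p) + cos(t)sin(p)
# at random real angles, not just acute ones.
random.seed(0)
for _ in range(1000):
    t = random.uniform(-10, 10)
    p = random.uniform(-10, 10)
    lhs = cmath.exp(1j * (t + p)).imag                       # sin(t + p)
    rhs = math.sin(t) * math.cos(p) + math.cos(t) * math.sin(p)
    assert math.isclose(lhs, rhs, abs_tol=1e-12)
```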


As an alternative approach, one may notice that for a fixed value of $\varphi$ both $f(\theta)=\sin(\theta+\varphi)$ and $g(\theta)=\sin\theta\cos\varphi+\cos\theta\sin\varphi$ are solutions of the differential equation

$$ f''+f=0,\quad f(0)=\sin\varphi,\quad f'(0)=\cos\varphi $$

hence they are the same function by the uniqueness part of the Cauchy-Lipschitz (Picard-Lindelöf) Theorem.
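The claim that both functions solve the same initial value problem can be verified symbolically with SymPy (a sketch added here for illustration, not part of the original answer):

```python
import sympy as sp

theta, phi = sp.symbols('theta phi', real=True)
f = sp.sin(theta + phi)
g = sp.sin(theta) * sp.cos(phi) + sp.cos(theta) * sp.sin(phi)

for h in (f, g):
    # Both satisfy the ODE  h'' + h = 0 ...
    assert sp.simplify(sp.diff(h, theta, 2) + h) == 0
    # ... with the same initial data at theta = 0:
    assert sp.simplify(h.subs(theta, 0) - sp.sin(phi)) == 0
    assert sp.simplify(sp.diff(h, theta).subs(theta, 0) - sp.cos(phi)) == 0
```

Uniqueness for this linear second-order problem then forces $f = g$ for every $\theta$, with $\varphi$ held fixed.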