Alternative definition for Exponential and Logarithmic functions to prove identities (and by extension sin and cos related identities)


Edit: This question is not about proving identities, but about representations that are easier to work with than the Taylor-series or integral definitions of the exp and ln functions. Please do not attempt to prove identities; I just want alternative initial representations. For example, the infinitely many zeros of sin are not obvious from the Taylor-series definition, whereas the Euler product formula $\sin(x) = x\prod_{n=1}^\infty \left(1-\frac{x^2}{n^2\pi^2}\right)$ makes it trivial to see that sin and cos have infinitely many zeros on the real line.
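As a quick numerical illustration (my addition, not part of the question), a truncated Euler product can be compared against the library sine; the function name `sin_product` and the truncation level are arbitrary choices:

```python
import math

def sin_product(x, n_terms=10000):
    """Truncated Euler product: x * prod_{n=1}^{N} (1 - x^2 / (n^2 pi^2))."""
    prod = x
    for n in range(1, n_terms + 1):
        prod *= 1.0 - x * x / (n * n * math.pi * math.pi)
    return prod

# The zeros at integer multiples of pi are explicit factors of the product,
# whereas they are invisible in the Taylor series.
for x in (0.5, 1.0, 2.0, math.pi):
    print(f"x={x:.4f}  product={sin_product(x):+.6f}  sin={math.sin(x):+.6f}")
```

The product converges slowly (the tail contributes a relative error of roughly $x^2/(\pi^2 N)$), but at $x=\pi$ the $n=1$ factor vanishes exactly, making the zero manifest.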

Are there better alternative definitions than $\exp(x) = {\large\sum\limits_{k=0}^\infty} \dfrac{x^k}{k!}, \ln(x) = {\large\int_1^x} \dfrac1t\ dt$ that can be used to derive their identities, e.g. $\exp (x+y)=\exp(x)\exp(y)$, $\ln (xy) =\ln (x)+\ln(y)$ (and, by extension, sin- and cos-related identities)?

Is there any starting point other than $\exp(x) = {\large\sum\limits_{k=0}^\infty} \dfrac{x^k}{k!}$, from which this series emerges as just one possible representation of the exponential function? If yes, what body of theory deals with this type of question (if any)?
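For reference, the functional equation already holds to machine precision for modest partial sums of the series definition; a small sketch (function name `exp_series` is mine):

```python
import math

def exp_series(x, n_terms=30):
    """Partial sum of the Taylor series: sum_{k=0}^{N-1} x^k / k!."""
    total, term = 0.0, 1.0
    for k in range(n_terms):
        total += term          # add x^k / k!
        term *= x / (k + 1)    # next term of the series
    return total

# Numerically, exp(x + y) = exp(x) * exp(y) for the partial sums.
x, y = 0.7, 1.3
print(exp_series(x + y), exp_series(x) * exp_series(y))
```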


There are 3 answers below.

Answer (score 0):

I'm pretty partial to using hyperbolic functions, so here's one mildly obtuse way to manipulate alternative definitions of the exponential function.

Since $\sinh(x) = \dfrac{e^x - e^{-x}}{2}$ and $\cosh(x) = \dfrac{e^x + e^{-x}}{2}$ (which, admittedly, is a bit circular for your question's sake...but whatever), it is easy to show that $e^{\pm x} = \cosh(x) \pm \sinh(x)$.

So what's the point? Well, you also probably need to know the hyperbolic analogue of the great-grand-daddy of all trig identities, $\cos^2(x) + \sin^2(x) = 1$:

$$\cosh^2(x) - \sinh^2(x) = 1.$$

Okay, now that you've got that, you can go right into manipulating expressions in a very, very similar way to the way you would prove identities involving sines and cosines.
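Both relations above can be spot-checked numerically with the standard library (a sanity check I've added, not part of the original answer):

```python
import math

# Check e^{+-x} = cosh(x) +- sinh(x) and cosh^2(x) - sinh^2(x) = 1
# at a few sample points.
for x in (-2.0, 0.0, 0.5, 3.0):
    c, s = math.cosh(x), math.sinh(x)
    assert abs(math.exp(x) - (c + s)) < 1e-10
    assert abs(math.exp(-x) - (c - s)) < 1e-10
    assert abs(c * c - s * s - 1.0) < 1e-10
print("hyperbolic identities hold numerically")
```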

I don't feel like I've answered the question in the scope you requested, but I hope this provides you with a different perspective on using alternative definitions of the exponential.

Answer (score 3):

Well, one way to see the exponential is to realize the following:

There are two canonical group structures on $\mathbb{R}$: $(\mathbb{R},+)$ and $(\mathbb{R}_{>0}, \cdot)$. The exponential and the identity map are the only two "interesting" functions* at the intersection of algebra and analysis: the identity map is the unique continuous isomorphism from the additive structure to itself with $f(1)=1$, and the exponential is the unique continuous isomorphism from the additive structure to the multiplicative structure with $f(1)=e$ (here $e$ enters only as a suitable distinguishing point). This is a good way to think about things but, as I said in the comments, it is not nearly as workable as the series definition, and I think no other definition will be: power series are simple, well understood, highly computational, etc. (See below also for the "differential-equation definition".)

*In fact, all other elementary functions can be obtained from them: polynomials, $\sin$, $\cos$, $\log$, etc.
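The homomorphism picture can be made concrete: any continuous homomorphism $f:(\mathbb{R},+)\to(\mathbb{R}_{>0},\cdot)$ is forced on the rationals to satisfy $f(p/q) = f(1)^{p/q}$, so fixing $f(1)=e$ pins it down to $\exp$ by continuity. A small sketch (the helper name `f_on_rational` is hypothetical, my own):

```python
import math

# For a continuous homomorphism (R, +) -> (R_{>0}, *), the group laws force
# f(p/q) = f(1)^(p/q) on the rationals; with f(1) = e this agrees with exp.
def f_on_rational(p, q, f1=math.e):
    return f1 ** (p / q)

for p, q in [(1, 1), (3, 2), (-5, 4)]:
    assert abs(f_on_rational(p, q) - math.exp(p / q)) < 1e-12
print("homomorphism values match exp on sample rationals")
```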


This question admits a lot of different answers. I'll approach one of the problems you mentioned: proving $\exp(x+y)=\exp(x)\exp(y)$. One can arrive at this from the Cauchy product formula, but also from the simple fact that $\exp'(x)=\exp(x)$ and $\exp(0)=1$ (which is something that follows easily from the power series representation, and can also be taken as the definition of the exponential, that is, the function that solves the differential equation $y'=y$ with initial condition $y(0)=1$).

For that, define, for fixed $y \in \mathbb{R}$, $$f(x)=\frac{\exp(x+y)}{\exp(x)\exp(y)}.$$

We then have $$f'(x)=\frac{\exp(x)\exp(y) \exp(x+y)-\exp(x+y)\exp(x)\exp(y)}{\exp(x)^2\exp(y)^2}=0,$$ by the quotient rule. Therefore, $f(x)$ is constant. Since it is clear that $f(0)=1$, we have that $\exp(x+y)=\exp(x)\exp(y)$ for all $x$. But $y$ is also arbitrary, hence we have the equality for all $x,y \in \mathbb{R}$.
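The differential-equation characterization can also be checked numerically: integrating $y'=y$, $y(0)=1$ with forward Euler (a rough sketch I've added; the step count is arbitrary) approximately reproduces $\exp$ and its functional equation:

```python
import math

def exp_via_ode(x, n_steps=100_000):
    """Forward-Euler integration of y' = y, y(0) = 1, from 0 to x."""
    h = x / n_steps
    y = 1.0
    for _ in range(n_steps):
        y += h * y   # Euler step: y_{k+1} = y_k + h * y_k
    return y

print(exp_via_ode(1.0), math.exp(1.0))  # agree up to O(h) discretization error
print(exp_via_ode(1.2), exp_via_ode(0.5) * exp_via_ode(0.7))
```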

Answer (score 0):

When I learned the definitions of these functions, we defined $\ln(x) = {\large\int_1^x} \dfrac1t\ dt$, from which we were able to derive the property $\ln(ab)=\ln(a)+\ln(b)$ as follows: $$\ln(ab)= \int_1^{ab} \dfrac1t\ dt= \int_1^{a} \dfrac1t\ dt +\int_a^{ab} \dfrac1t\ dt.$$

Letting $t=ax$ in the second integral (so that $\dfrac{1}{ax}\,d(ax) = \dfrac{1}{x}\,dx$), $$\int_1^{a} \dfrac1t\ dt +\int_1^{b} \dfrac1{ax}\ d(ax)=\ln(a)+\ln(b).$$
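The integral definition lends itself to a numerical sanity check: a midpoint-rule approximation of $\int_1^x dt/t$ (function name and step count are my own choices) exhibits the additivity $\ln(ab)=\ln(a)+\ln(b)$:

```python
import math

def ln_integral(x, n=200_000):
    """Midpoint-rule approximation of ln(x) = int_1^x dt / t, for x > 0."""
    h = (x - 1.0) / n
    return sum(h / (1.0 + (i + 0.5) * h) for i in range(n))

a, b = 2.0, 3.0
print(ln_integral(a * b), ln_integral(a) + ln_integral(b), math.log(a * b))
```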

Then you can define $e^x$ as the inverse of $\ln(x)$ and take the natural log of $e^ae^b$: $$\ln(e^ae^b)=\ln(e^a)+\ln(e^b)=a+b.$$ Now undo the natural log using the exponential and you get $e^{a}e^b=e^{a+b}$. This gives a helpful primer on the exponential before continuing on to the hyperbolic functions in Xoque55's answer.
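The inverse-function step above can likewise be spot-checked with the library functions (nothing beyond the identities already stated in the text):

```python
import math

# ln(e^a * e^b) = ln(e^a) + ln(e^b) = a + b, hence e^a * e^b = e^{a+b}.
a, b = 1.25, -0.4
lhs = math.log(math.exp(a) * math.exp(b))
assert abs(lhs - (a + b)) < 1e-12
assert abs(math.exp(a) * math.exp(b) - math.exp(a + b)) < 1e-12
print("functional equation verified at a sample point")
```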