Why can't we define $\frac{1}{0}$ to be $1$ (or anything else), while we can define $1^0$ to be $1$?


We know that we can't define division by zero "in any mathematical system that obeys the axioms of a field", because it would be inconsistent with such axioms.

(1) Why can we define $a^0$ ($a\neq 0$) to be $1$? Is it possible to prove that such a definition is consistent with every rule of arithmetic? How can we conclude that, in order to define $a^0$ ($a\neq 0$), we don't need to abolish any other basic rule of arithmetic?

(2) More generally, how can we know whether a definition is consistent with a given mathematical theory?

There are 2 best solutions below.

There is no general algorithm for determining whether a theory is consistent; that is a huge topic, which includes Gödel's incompleteness theorems. But your specific question is easier.

In Peano arithmetic (with axioms stated using $+,\times$) an exponential function $x^y$ can be defined by the recursion $x^0=1$ and $x^{s(y)}=x\times x^{y}$. The axioms prove that functions can be defined by recursion. So if you believe (as nearly everyone does) that Peano arithmetic (with axioms stated using $+,\times$) is consistent, then you must believe that its extension with this exponential function is consistent.
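That recursion can be sketched directly in code. The sketch below is only illustrative (the function name `pow_rec` is ours, and Python integers stand in for the naturals); the base case $x^0=1$ is a definition, and every other value follows from the recursive step:

```python
def pow_rec(x: int, y: int) -> int:
    """Exponentiation on the naturals by primitive recursion:
    x^0 = 1 and x^(s(y)) = x * x^y."""
    if y == 0:
        return 1                    # base case: x^0 = 1 by definition
    return x * pow_rec(x, y - 1)    # recursive step: x^(y+1) = x * x^y

print(pow_rec(2, 10))  # 1024
print(pow_rec(5, 0))   # 1
```

Note that under this definition $a^0=1$ for every $a$, with no division anywhere, so no other rule of arithmetic is disturbed.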

Since your question mentions basic rules of arithmetic, I answered in terms of Peano arithmetic. If you merely want consistency with the field axioms, the question is simpler still: the field of integers modulo 2 gives a finite model of those axioms plus $x^1=x$ and $x^0=1$, which proves them consistent. But this model includes very little of arithmetic; notably, it does not include $x^{(y+z)}=x^y\times x^z$. See "finite field" on Wikipedia.
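Because the model is finite, its consistency check is a finite computation. Here is a sketch (helper names `add`, `mul`, `pw` are ours) that exhaustively verifies a couple of the field axioms on GF(2), together with the extra equations $x^1=x$ and $x^0=1$:

```python
from itertools import product

F = (0, 1)                       # carrier of GF(2): the integers modulo 2
add = lambda a, b: (a + b) % 2   # field addition
mul = lambda a, b: (a * b) % 2   # field multiplication

def pw(x: int, n: int) -> int:
    """x^n in GF(2), built from multiplication with pw(x, 0) = 1."""
    return 1 if n == 0 else mul(x, pw(x, n - 1))

# Exhaustively check sample axioms over the whole (finite) carrier.
for a, b, c in product(F, repeat=3):
    assert add(a, b) == add(b, a)                          # commutativity
    assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c))  # distributivity
for a in F:
    assert pw(a, 1) == a   # x^1 = x
    assert pw(a, 0) == 1   # x^0 = 1 holds in the model
print("all checks pass")
```

An exhaustive check over a finite carrier is exactly what "giving a finite model" amounts to computationally.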

We don't actually define $1^0$ to be $1$; that is its value. Likewise $0^0=1$ is a derived value: the supposedly indeterminate values all rely on the same $0/0$ argument that would let $0=1$. If you take the limit of $x^{ax}$ as $x\to 0^+$, the limit is $1$ for all $a$.
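A quick numerical probe of that limit claim (the tolerance and sample values of $a$ are ours): since $x^{ax}=e^{\,a\,x\ln x}$ and $x\ln x\to 0$ as $x\to 0^+$, the value should approach $1$ for every $a$.

```python
# Numerically probe the limit of x^(a*x) as x -> 0+ for several a.
for a in (-2.0, 0.5, 3.0):
    x = 1e-8
    val = x ** (a * x)          # equals exp(a * x * ln x); x*ln(x) -> 0
    assert abs(val - 1.0) < 1e-5, (a, val)
print("x**(a*x) is close to 1 near x = 0 for all sampled a")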

When you ask how one plans to approach zero, a step toward it is either by way of a root, e.g. a square root, or by way of division.

Roots amount to dividing a positive number by a positive number, and this never reaches zero.

Division implies $0/0$ is being used; i.e., to suppose $0^0=0$ implies either that you can reach $0$ by dividing non-zero numbers, or that you can reach $0$ by dividing by zero. Since the first is not accepted in maths, $0^0=0$ could only arise from division by zero.