Proof of the Associative Law and the Commutative Law


The associative law of multiplication for three positive integers $a,b$ and $c$ can be proved$^1$ easily from the commutative law and the property of "number of things".
We can prove$^2$ the associative law of multiplication for four positive integers $a,b,c$ and $d$ from the associative law of multiplication of three positive integers and the property of "number of things". We can also prove$^2$ that the product of three positive integers is independent of the arrangement of the individual positive integers in the expression.

  • Can we prove$^3$ the associative law of multiplication for $n$ numbers formally in some way?
  • Can we prove that the product $a_1a_2a_3\cdots a_n$ is independent of the order in which the individual numbers are arranged, e.g. $a_1a_2a_3\cdots a_n=a_1a_na_2a_3\cdots a_{n-1}$ ($a_i \in \mathbb{N}$ for every $i$), that is, the commutative law?

This portion is additional, added for the sake of clarity; it may be omitted if it seems too much of a digression.
$^1$ $(ab)c$ is the sum of $c$ terms, each of which is $ab = (a + a + \cdots$ to $b$ terms).
By the commutative law, $(a + a + \cdots$ to $b$ terms$)=(b + b + \cdots$ to $a$ terms).
So $(ab)c=(ba)c$.
$(ba)$ is the sum of $b$ numbers, each of which is $a$.
$(ba)c$ is the sum of $c$ numbers, each of which is $(ba)$, that is, $(ba)c=(ba+ba+ba+\cdots$ to $c$ terms$)$. Each $(ba)$ in $(ba)c$ can be considered as a column of $a$ numbers, each of which is $b$, i.e. $(ba)c= \begin{bmatrix}b\\b\\\vdots\\b\end{bmatrix}_{a\times 1}+ \begin{bmatrix}b\\b\\\vdots\\b\end{bmatrix}_{a\times 1}+\cdots+\begin{bmatrix}b\\b\\\vdots\\b\end{bmatrix}_{a\times 1}$ up to $c$ times.
This can be rearranged as:
$(ba)c= \begin{bmatrix}\begin{bmatrix}b&b&\cdots&b\end{bmatrix}_{1\times c}\\\begin{bmatrix}b&b&\cdots&b\end{bmatrix}_{1\times c}\\\vdots\\\begin{bmatrix}b&b&\cdots&b\end{bmatrix}_{1\times c}\end{bmatrix}_{a\times 1}=\begin{bmatrix}bc\\bc\\\vdots\\bc\end{bmatrix}_{a\times1}=(bc+bc+bc+\cdots$ to $a$ terms$)=(bc)a=a(bc).$
The associative law for three numbers can also be proved by considering $(ab)c$ as a rectangle $(ab)$ (having $b$ rows and $a$ columns of 1's) repeated $c$ times in the third dimension, but I cannot typeset that in LaTeX, so I chose to prove it as I did.
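The chain of equalities above can be spot-checked numerically. Below is a minimal sketch (the helper name `mul` is my own, not from the post) that defines multiplication as repeated addition and verifies $(ab)c=(ba)c=(bc)a=a(bc)$ for small positive integers:

```python
# Multiplication of positive integers defined as repeated addition,
# as in footnote 1. The name `mul` is illustrative, not from the post.
def mul(a, b):
    """Compute a*b as the sum (a + a + ... to b terms)."""
    total = 0
    for _ in range(b):
        total += a
    return total

# Spot-check the chain (ab)c = (ba)c = (bc)a = a(bc) for small values.
for a in range(1, 5):
    for b in range(1, 5):
        for c in range(1, 5):
            ab_c = mul(mul(a, b), c)
            ba_c = mul(mul(b, a), c)
            bc_a = mul(mul(b, c), a)
            a_bc = mul(a, mul(b, c))
            assert ab_c == ba_c == bc_a == a_bc
print("associativity chain verified for 1..4")
```

This is only a finite check for illustration, of course; it is not a substitute for the proof.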


$^{2}$ $abcd = (ab)(c)(d) = (ba)(c)(d)=bacd$
$abcd= (ab)(c)(d)=(c)(ab)(d)=cabd$
$abcd=(c)(ab)(d)=(c)(ba)(d)=cbad$. Similarly, all other orders can be formed.
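The claim that all $24$ arrangements of four factors give the same product can also be checked directly. A minimal sketch (the helpers `mul` and `prod_left` are my own, not from the post), again using multiplication defined as repeated addition:

```python
from itertools import permutations

def mul(a, b):
    """a*b as repeated addition (illustrative helper, not from the post)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def prod_left(xs):
    """Left-associated product ((x1*x2)*x3)*... built from the binary mul."""
    result = xs[0]
    for x in xs[1:]:
        result = mul(result, x)
    return result

a, b, c, d = 2, 3, 4, 5
# Every arrangement of the four factors yields the same value, 2*3*4*5 = 120.
values = {prod_left(p) for p in permutations([a, b, c, d])}
assert values == {120}
```

Again this is only a finite illustration of the rearrangement argument, not a proof for general $a,b,c,d$.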


$^3$ Please do not use set theory to prove the associative law. I am using numbers to count concrete things.


There are 2 best solutions below


Yes. Prove, by induction on $n$, that any product of $n$ factors, no matter how it's parenthesized, agrees with the product of the same factors, in the same order, parenthesized "to the left", i.e., $((\dots((a_1a_2)a_3)\dots a_{n-2})a_{n-1})a_n$. You already know the first nontrivial case, $n=3$, so you only need to do the induction step.
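The statement being proved by induction can be tested exhaustively for small cases. Here is a minimal sketch (function names are my own, not from the answer) that enumerates every parenthesization of a fixed ordered list of factors, using only the binary product, and checks that each agrees with the left-associated product:

```python
from functools import reduce

def mul(a, b):
    """The binary product, defined as repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def all_parenthesizations(xs):
    """Yield the value of every full parenthesization of xs,
    keeping the factors in order and using only the binary mul."""
    if len(xs) == 1:
        yield xs[0]
        return
    # Split at every possible position of the outermost multiplication.
    for i in range(1, len(xs)):
        for left in all_parenthesizations(xs[:i]):
            for right in all_parenthesizations(xs[i:]):
                yield mul(left, right)

factors = [2, 3, 1, 4, 2]
left_assoc = reduce(mul, factors)  # ((((2*3)*1)*4)*2)
assert all(v == left_assoc for v in all_parenthesizations(factors))
```

This only checks one list of factors, but it mirrors the structure of the induction: every parenthesization splits into a left and a right part, each shorter than $n$, to which the induction hypothesis applies.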


To me, one problem here is that you have not rigorously defined multiplication. As far as I am concerned, anything that uses "$\cdots$" is hand-waving.

Also, what is your definition of "number of things"? Please give examples.