Show by an example that in computer arithmetic a + (b + c) may differ from (a + b) + c



Here is what I am thinking, but I am not really sure. Maybe someone can lead me in the right direction or tell me if my answer is sufficient.

What I was thinking was it is true that:

$a + (b + c) = (a + b) + c$ in normal arithmetic:

For example: $5 + (-2 + 7) = 10$, and similarly $(5 - 2) + 7 = 10$.

Yet in computer arithmetic this isn't the case. For example:

$\infty + (-\infty + 1) = 0$, but here we find that $(\infty + (-\infty)) + 1 = 1$. (Note that under IEEE 754 rules $\infty + (-\infty)$ actually evaluates to NaN rather than $0$, so this particular example may not hold on real hardware.)

$\therefore$ in computer arithmetic $a + (b + c) \ne (a + b) + c$.


There are 2 best solutions below

BEST ANSWER

For a case where $a+(b+c) \ne (a+b)+c$, consider one where $b+c$ overflows but $a = -b$, so that $(a+b)+c = c$, which is a normal result.
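A minimal Python sketch of this case (Python floats are IEEE 754 doubles, where overflow produces `inf`; the particular values below are illustrative, not from the answer):

```python
a = -1e308      # a = -b
b = 1e308
c = 1e308       # b + c exceeds the largest double (~1.8e308)

left = a + (b + c)    # b + c overflows to inf, and -1e308 + inf is still inf
right = (a + b) + c   # a + b == 0.0 exactly, so the result is just c

print(left)   # inf
print(right)  # 1e+308
```

The grouping that overflows poisons the rest of the sum, while the grouping that cancels first stays finite.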


In floating point you can have $b$ smaller than the resolution of $a$ (meaning it is too small to change the last bit in the mantissa of $a$), and hence $a+b=a$. If you take $b=c$, then $(a+b)+c=(a+b)+b=a+b=a$; but if $b+c$ is not smaller than the resolution of $a$, then $a+(b+c)\neq a$.
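A short Python sketch of this rounding effect, assuming IEEE 754 doubles (where $2^{-53}$ is half an ulp of $1.0$; the specific constants are my choice, not from the answer):

```python
a = 1.0
b = c = 2**-53   # too small to change the last mantissa bit of 1.0 on its own

left = (a + b) + c    # a + b rounds back to 1.0, and adding c rounds back again
right = a + (b + c)   # b + c == 2**-52, a full ulp of 1.0, so the sum exceeds 1.0

print(left == a)   # True
print(right == a)  # False
```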

The actual numbers depend on the language, floating-point type, and operating system.