Show by an example that in computer arithmetic $a + (b + c)$ may differ from $(a + b) + c$.
Here is what I am thinking, but I am not really sure. Maybe someone can lead me in the right direction or tell me if my answer is sufficient.
What I was thinking is that in exact (real-number) arithmetic it is true that
$a + (b + c) = (a + b) + c$.
For example: $5 + (-2 + 7) = 10$ and similarly $(5 - 2) + 7 = 10$.
Yet in floating-point computer arithmetic this isn't the case. My first thought was to use infinities, say $\infty + (-\infty + 1)$ versus $(\infty + -\infty) + 1$, but that example doesn't actually work: in IEEE 754 arithmetic $\infty + (-\infty)$ is NaN, not $0$, so both groupings evaluate to NaN. A correct example uses a finite value large enough to absorb the $1$. Take $a = 10^{20}$, $b = -10^{20}$, $c = 1$. Then $b + c$ rounds back to $-10^{20}$, because $1$ is far smaller than half a unit in the last place of $10^{20}$, so
$$a + (b + c) = 10^{20} + (-10^{20}) = 0, \qquad (a + b) + c = 0 + 1 = 1.$$
$\therefore$ in computer arithmetic $a + (b + c) \ne (a + b) + c$ in general.
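A quick sketch of this in Python (whose floats are IEEE 754 doubles), also checking the well-known $0.1 + 0.2 + 0.3$ case, which shows the same rounding effect at ordinary magnitudes:

```python
# Non-associativity from rounding: 1.0 is absorbed when added to -1e20,
# because 1.0 is far below half a ulp (unit in the last place) of 1e20.
a, b, c = 1e20, -1e20, 1.0

left_assoc = (a + b) + c    # (0.0) + 1.0 -> 1.0
right_assoc = a + (b + c)   # b + c rounds to -1e20, so 1e20 + (-1e20) -> 0.0

print(left_assoc)                 # 1.0
print(right_assoc)                # 0.0

# The same effect at ordinary magnitudes:
print((0.1 + 0.2) + 0.3)          # 0.6000000000000001
print(0.1 + (0.2 + 0.3))          # 0.6
```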
For another case where $a + (b + c) \ne (a + b) + c$, consider $b + c$ overflowing while $a = -b$: then $(a + b) + c = c$ is a normal result, but $a + (b + c)$ is computed from the overflowed sum (in IEEE 754 the overflow produces $\pm\infty$, which then propagates).
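The overflow case can be sketched the same way. In IEEE 754 doubles an overflowing sum becomes $\pm\infty$ rather than raising an error; the value $1.7 \times 10^{308}$ below is just chosen to sit near the double maximum of about $1.8 \times 10^{308}$:

```python
import math

b = 1.7e308   # near the largest finite double (~1.7976e308)
c = 1.7e308
a = -b        # a = -b, so a + b is exactly 0.0

print((a + b) + c)              # 1.7e+308 -- the "normal" result
print(a + (b + c))              # inf -- b + c overflows, and -b + inf is still inf
print(math.isinf(a + (b + c)))  # True
```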