Today, a friend gave me a "proof" of $1=2$ and challenged me to find the fallacy.
$1 = 1$
$1 = 1 + 0 + 0 + 0 ...$
$1 = 1 + 1 - 1 + 1 - 1 + 1 - 1 ...$
$1 = 2 - 1 + 1 - 1 + 1 - 1 ...$
$1 = 2 + 0 + 0 ...$
$1 = 2$
My answer was that once you turn the initial $1 + 1$ into a $2$, everything is offset, so a $-1$ is always left over at the end no matter how many times the pattern is repeated. This trailing $-1$ balances the $2$ at the beginning, so $1 = 1$ still holds. That is,
$$1 = 1 + 0 + 0 + 0 ... = 1 + (1 - 1) + (1 - 1) + (1 - 1) = 2 + (-1 + 1) + (-1 + 1) - 1$$
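My regrouping argument can be checked numerically for finite sums. This is a small sketch (the function names are my own, just for illustration) verifying that for any finite number $n$ of $+1-1$ pairs, both groupings evaluate to $1$:

```python
def grouped_as_ones(n):
    # 1 + (1 - 1) + (1 - 1) + ... with n pairs
    return 1 + sum(1 - 1 for _ in range(n))

def grouped_as_two(n):
    # 2 + (-1 + 1) + (-1 + 1) + ... - 1 with n - 1 pairs and a trailing -1
    return 2 + sum(-1 + 1 for _ in range(n - 1)) - 1

# Both groupings of the same finite sum equal 1 for every n
for n in range(1, 10):
    assert grouped_as_ones(n) == grouped_as_two(n) == 1
```

So for finite sums the trailing $-1$ really does cancel the $2$, which is exactly where my friend's objection about infinity comes in.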
However, my friend claimed that my answer only applies if the $+ 1 - 1$ repeats for a finite number of times. He argues that because the sequence repeats infinitely and things work differently when working with infinity, my answer is not valid.
Can anyone enlighten me to the true fallacy in this proof?
Some users pointed me toward reading up on convergent and divergent series.
As I currently understand it, equating $1+0+0+0... = 1+1-1+1-1+1-1...$ is the fallacy, because the series on the right does not converge (much less to $1$): its partial sums alternate between $1$ and $0$ forever. Therefore the two sides are not equal.
To prove that this is the fallacy, we can apply a convergence test (for example, the term test: the terms $\pm 1$ do not tend to $0$) to show that the series on the right diverges, so the equation cannot hold.
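The oscillation of the partial sums can be seen directly with a short computation; this is just a sketch of the divergence, not a formal proof:

```python
# Partial sums of 1 - 1 + 1 - 1 + ... (Grandi's series).
# They oscillate between 1 and 0, so the series has no limit.
partial_sums = []
s = 0
for k in range(8):
    s += (-1) ** k   # terms are 1, -1, 1, -1, ...
    partial_sums.append(s)
print(partial_sums)  # [1, 0, 1, 0, 1, 0, 1, 0]
```

Since the partial sums never settle on a single value, the right-hand series has no sum at all, and writing $1 = 1 + 1 - 1 + 1 - 1 ...$ is meaningless.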
Am I correct in my deduction?