I'm rediscovering my childhood love of mathematics after many, many years away and looking to rebuild my skills from the ground up, starting with fundamental arithmetic. I'm working with a text that I enjoy, and it makes a point of demonstrating the ways one algorithm can provide a foundation for another.
I have become very enamored of this idea of rigorously proving each algorithm; I find a simple beauty in proofs, and the absolute mastery of a concept that comes from such rigorous examination appeals to me far more, as a hobbyist, than breadth of knowledge. My text glosses over rigorous proofs of the most basic arithmetic algorithms, such as addition and subtraction. I have found some intriguing examples of such proofs on this board, and with great effort I can wrap my mind around them, but I am in no position to articulate such a proof on my own, and much of the notation used is unfamiliar to me.
What sort of material would I need to study to have the knowledge necessary to devise rigorous proofs of fundamental algorithms like the examples above? Is it possible to learn the necessary mathematical logic independently of the arithmetic/algebra? If so, which texts would you recommend?
Let me be absolutely clear: I am not looking for proofs themselves, but rather the knowledge necessary to devise such proofs on my own. I also understand that at some point, one has to accept certain ideas as axiomatic to be able to begin an inductive process, but I wish to learn how to make that distinction in an educated way. Thank you.
Understanding numerical algorithms necessarily involves some arithmetic and algebra, not just logic. I would strongly recommend you dip into Knuth's wonderful Art of Computer Programming. It is an encyclopaedic work, but quite easy to use as a very readable reference once you have read the introductory sections. Volume 2 includes a discussion of the classical "pencil and paper" algorithms for the basic arithmetic operations.
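To give a flavour of what there is to prove, here is a sketch (in Python, as illustration; this is my own rendering, not Knuth's presentation) of the classical digit-by-digit addition algorithm. The comment in the loop states the invariant that a rigorous correctness proof would establish by induction on the digit position, which is exactly the kind of statement the logic texts will teach you to formulate and prove.

```python
def add_digits(a, b, base=10):
    """Pencil-and-paper addition of two natural numbers given as
    lists of digits, least-significant digit first."""
    result = []
    carry = 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        # Loop invariant (the heart of an inductive correctness proof):
        # result holds the low i digits of a + b, and carry is 0 or 1.
        total = da + db + carry
        result.append(total % base)
        carry = total // base
    if carry:
        result.append(carry)
    return result

# 47 + 85 = 132, written with the least-significant digit first:
print(add_digits([7, 4], [5, 8]))  # [2, 3, 1]
```

A proof would show that the invariant holds before the first iteration, is preserved by each iteration (this is where the algebra of `total % base` and `total // base` comes in), and, together with the final carry step, implies that the output digits represent a + b.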