The diagonalization principle has been used to prove results such as the uncountability of the set of real numbers in the interval [0, 1]. How is this principle used in other areas of mathematics and computer science (e.g., the theory of computation)?
Diagonalization Principle

1.2k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-04-09.

There are 2 answers below.
I'm not sure what level of answer you are looking for. At an introductory level, Wikipedia has some relevant pages linked here.
Books that touch on the elementary theory of computation will have diagonal arguments galore. For example, my Introduction to Gödel's Theorems (CUP, 2nd edn. 2013) has lots!
A step up in sophistication, there is a nice paper, 'A Universal Approach to Self-Referential Paradoxes, Incompleteness and Fixed Points' by Noson S. Yanofsky, The Bulletin of Symbolic Logic, Vol. 9, No. 3 (Sep. 2003), pp. 362-386. There is an arXiv version of the paper here. This paper tries to locate one template for diagonal arguments which can then be applied in multiple contexts.
There is an interesting result that the set of languages recognizable by a Turing machine is a proper subset of the set of all languages.
http://www.cs.ox.ac.uk/people/paul.goldberg/FCS/slides8.pdf
The outline goes something like this: the set of all Turing machines is countable, while the set of all languages (over a fixed alphabet) is uncountable; therefore there exists at least one language that is not recognized by any Turing machine.

A diagonalization argument, much like Cantor's for the reals, is used to show that the set of all languages is uncountable.
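The diagonal step at the heart of both arguments can be sketched in code. Below is a hypothetical illustration (the names `diagonalize` and `listing` are mine, not from any source): an infinite binary sequence is modeled as a function from index to bit, and given any proposed enumeration of such sequences, flipping the n-th bit of the n-th sequence yields a sequence that cannot appear anywhere in the enumeration.

```python
# Sketch of Cantor's diagonal argument.
# An infinite binary sequence is modeled as a function n -> {0, 1}.
# An enumeration is a function i -> (the i-th sequence).

def diagonalize(enumeration):
    """Return a sequence differing from enumeration(i) at position i, for every i.

    Since it disagrees with the i-th sequence somewhere, it equals none of them,
    so no enumeration can list all binary sequences.
    """
    return lambda n: 1 - enumeration(n)(n)  # flip the n-th bit of the n-th sequence

# Example enumeration (necessarily incomplete): sequence i has a single 1 at position i.
listing = lambda i: (lambda n: 1 if n == i else 0)

missing = diagonalize(listing)

# The diagonal sequence disagrees with sequence i at position i:
for i in range(100):
    assert missing(i) != listing(i)(i)
```

Identifying a language over a fixed alphabet with its characteristic sequence (bit n says whether the n-th string is in the language) turns this directly into the proof that the languages are uncountable, while the Turing machines, having finite descriptions, remain countable.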