How does one decide whether a theorem is qualitatively complex rather than hard?
Namely, how does one know that a theorem's proof is irreducibly complex, in the sense that no proof of the theorem can reach Q.E.D. in fewer deductive steps than its shortest possible proof requires?
Proof complexity is mainly studied for propositional calculus. To extend the notion to typical "real" math theorems that require the expressiveness of first-order logic, you could define the complexity of a theorem to be the number of steps in its shortest natural deduction proof.
I don't think there's a practical way to compute this, though; you'd essentially have to enumerate all possible proofs in order of length, checking each to see whether it proves the given sentence. If you don't already know that the given sentence (or its negation) is provable (i.e. that the sentence is not independent), there's no way to know when to stop searching.
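To make the enumeration idea concrete, here is a toy sketch in Python. Instead of a real proof system it uses Hofstadter's MIU string-rewriting system as a stand-in (the system, the function names, and the `max_len` cutoff are all my choices for illustration): "theorems" are strings derivable from the axiom `MI`, a "proof" is a sequence of rule applications, and we search for the shortest one by breadth-first search over derivations in order of length.

```python
from collections import deque

# Toy stand-in for proof search: Hofstadter's MIU string-rewriting system.
# "Theorems" are strings derivable from the axiom "MI"; a "proof" is a
# sequence of rule applications, and its length is the number of steps.

def successors(s):
    """All strings reachable from s by one rule application."""
    out = set()
    if s.endswith("I"):                  # rule 1: xI -> xIU
        out.add(s + "U")
    if s.startswith("M"):                # rule 2: Mx -> Mxx
        out.add(s + s[1:])
    for i in range(len(s) - 2):          # rule 3: replace any III with U
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])
    for i in range(len(s) - 1):          # rule 4: delete any UU
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])
    return out

def shortest_derivation_length(target, axiom="MI", max_len=12):
    """Breadth-first search for the shortest derivation of `target`.

    The max_len cutoff is the catch: without some externally supplied
    bound, a failed search could run forever, and we could never tell
    whether `target` is underivable or merely needs a longer detour
    through bigger intermediate strings.
    """
    seen = {axiom}
    queue = deque([(axiom, 0)])
    while queue:
        s, steps = queue.popleft()
        if s == target:
            return steps
        for t in successors(s):
            if len(t) <= max_len and t not in seen:
                seen.add(t)
                queue.append((t, steps + 1))
    return None  # no derivation found within the length cutoff
```

For example, `shortest_derivation_length("MIU")` returns `1` and `shortest_derivation_length("MUI")` returns `3` (via MI, MII, MIIII, MUI), while `shortest_derivation_length("MU")` returns `None`; MU is famously underivable, but the search by itself cannot distinguish "underivable" from "needs a derivation exceeding the cutoff," which is exactly the problem described above.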
More precisely, there's no computable function $L$ from strings to numbers such that whenever $\varphi$ is provable, $L(\varphi)$ is an upper bound on the length of the shortest proof of $\varphi$. If there were, then we could solve the halting problem by letting $\varphi$ be the statement that a given program halts and then checking all possible proofs up to length $L(\varphi)$ to see if $\varphi$ is provable.
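To spell the reduction out (a sketch of the standard argument; the notation $\varphi_P$ and the stated hypotheses are mine): suppose such a computable $L$ existed, and for a program $P$ write $\varphi_P$ for the sentence "$P$ halts." Assume the ambient theory is sound and proves $\varphi_P$ whenever $P$ actually halts. Then define

$$H(P) = \begin{cases} \text{"halts"} & \text{if some proof of length} \le L(\varphi_P) \text{ proves } \varphi_P, \\ \text{"doesn't halt"} & \text{otherwise.} \end{cases}$$

The search in the first case ranges over finitely many candidate proofs, so $H$ is computable; under the two hypotheses it correctly decides the halting problem, which is impossible. Hence no such computable $L$ exists.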
In other words, as we consider longer and longer sentences, we encounter theorems that require "very long" proofs: any function of sentence length that bounds the length of the shortest proof of every provable sentence must eventually exceed every computable function.