Computational Complexity of Equational Logic


Equational logic uses a surprisingly small set of axioms to prove all algebraic identities (algebraic in the sense of universal algebra, so theories like field theory fall outside this scope). This makes proving an identity in algebra seem fairly easy, which matches practical experience: usually you can find a proof quickly or even instantly, whereas disproving an identity can require constructing a complex counterexample. I was wondering whether this can be demonstrated formally. Specifically, suppose we have an efficient algorithm for doing equational logic. Given an algebraic identity consisting of $n$ symbols, what can we say asymptotically about the maximum time the algorithm will take to prove it? Note that I am only interested in identities. There is no disproof in equational logic, so the algorithm will never halt on a non-identity.
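To make the "finding a proof" side concrete, here is a toy stand-in for a complete equational prover, using string rewriting over a set of bidirectional relations rather than full term rewriting with unification. All names (`neighbors`, `prove_equal`, the cutoff `max_steps`) are illustrative, not a real prover's API. As in the question, the search is only a semi-decision procedure: on a non-identity it never finds a proof and, without a cutoff, would run forever.

```python
from collections import deque

# Toy equational proof search: words over an alphabet, with bidirectional
# relations l <-> r that may be applied anywhere inside a word. Two words
# are provably equal iff breadth-first search from one reaches the other.

def neighbors(word, relations):
    """Yield every word obtainable by one application of some relation."""
    for l, r in relations:
        for a, b in ((l, r), (r, l)):  # relations are used in both directions
            start = 0
            while True:
                i = word.find(a, start)
                if i < 0:
                    break
                yield word[:i] + b + word[i + len(a):]
                start = i + 1

def prove_equal(w1, w2, relations, max_steps=10_000):
    """BFS for a chain of rewrites from w1 to w2.

    Returns the chain of intermediate words (a proof) if found within
    max_steps expansions, else None -- a search cutoff, not a disproof."""
    seen = {w1: None}   # word -> predecessor, for proof reconstruction
    queue = deque([w1])
    steps = 0
    while queue and steps < max_steps:
        w = queue.popleft()
        steps += 1
        if w == w2:
            proof = []
            while w is not None:
                proof.append(w)
                w = seen[w]
            return proof[::-1]
        for nxt in neighbors(w, relations):
            if nxt not in seen:
                seen[nxt] = w
                queue.append(nxt)
    return None
```

For example, with the single commutativity relation `("ba", "ab")`, `prove_equal("bab", "abb", [("ba", "ab")])` finds the one-step proof `["bab", "abb"]`, while `prove_equal("aab", "abb", ...)` exhausts the (finite) set of reachable words and returns `None` without ever certifying a disproof.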

Best answer

The short answer is: equational logic is as powerful as computation in general, so there is no computable bound for the time needed to prove equational problems.

One way to make this precise is the following: fix a complete algorithm $A$ for proving equations from finite sets of equations. Then there is no computable function $f: \mathbb{N} \to \mathbb{N}$ such that $A$ terminates with a proof, within $f(n)$ steps, on all provable inputs of size $n$.

Suppose such a function existed. Let $\langle S, R\rangle$ be a finite presentation of a group with an undecidable word problem. Then the following would be a decision algorithm for the word problem in this group: given an input word $w$, let $n$ be the size of the group axioms plus the size of $\langle S, R\rangle$ plus the size of $w$. Compute $k = f(n)$. Run $A$ for $k$ steps, trying to prove $w = e$ from the group axioms and $\langle S, R\rangle$. If $A$ outputs a proof, answer yes; otherwise answer no. Since the word problem for this group is undecidable, no such $f$ can exist.
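To illustrate the shape of this reduction: of course no code can decide the word problem for a group where it is undecidable, so this sketch substitutes the free group on $\{a, b\}$ (relations $R = \emptyset$), where $w = e$ is decidable by free reduction. Here `equals_identity` plays the role that "run $A$ for $f(n)$ steps and check for a proof" would play in the argument; the encoding (capital letters for inverses) and all names are illustrative assumptions.

```python
# Stand-in for the hypothetical decision procedure, in the one case
# where it genuinely exists: the free group on {a, b}. Inverses are
# written as capitals (A = a^-1, B = b^-1). In the free group,
# w = e holds iff w freely reduces to the empty word.

def free_reduce(word: str) -> str:
    """Cancel adjacent inverse pairs (aA, Aa, bB, Bb) until none remain."""
    stack = []
    for c in word:
        if stack and stack[-1] == c.swapcase():
            stack.pop()          # cancel x followed by x^-1 (or vice versa)
        else:
            stack.append(c)
    return "".join(stack)

def equals_identity(word: str) -> bool:
    """Decide w = e in the free group on {a, b}."""
    return free_reduce(word) == ""
```

For instance, `equals_identity("aBbA")` returns `True` (the word cancels to nothing), while `equals_identity("abAB")` returns `False`: the commutator of $a$ and $b$ is not the identity in the free group. For a presentation with an undecidable word problem, the answer's argument shows that no total procedure like `equals_identity` can exist, hence no computable bound $f$ on proof search either.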