Imagine a language of finite sequences of 0 and 1. The rules for simplifying strings in this language are given by:
1??x => x1101
0??x => x00
In these rules, the variable x denotes an arbitrary sequence of 0s and 1s, and the sign '?' denotes a single 0 or 1. Construct an expression for which the reduction process does not terminate.
I came across this problem in "Introduction to Functional Programming" by Richard Bird, and I can't find the solution. Can someone guide me on the methodology for solving this?
I'm not sure if this is correct, but I've written this:

So, with 10100 the reduction process goes as follows:
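To check a candidate like this mechanically, the two rules can be simulated directly: each step removes the first three symbols and appends 1101 or 00 depending on whether the string started with 1 or 0, and reduction stops once fewer than three symbols remain. Here is a small sketch in Python (the names `reduce_once` and `trace` are my own):

```python
def reduce_once(s):
    """Apply one rule, or return None if no rule matches."""
    if len(s) < 3:
        return None  # too short for the '???'-shaped left-hand side
    head, rest = s[0], s[3:]
    # 1??x => x1101  and  0??x => x00
    return rest + ("1101" if head == "1" else "00")

def trace(start, max_steps=50):
    """Follow reductions, stopping on termination, a repeated string, or max_steps."""
    seen = set()
    path = []
    s = start
    while s is not None and s not in seen and len(path) < max_steps:
        seen.add(s)
        path.append(s)
        s = reduce_once(s)
    return path, s  # s is None (terminated) or a string we saw before (cycle)

path, repeat = trace("10100")
print(path, repeat)  # → ['10100', '001101'] 10100
```

Running this shows that 10100 reduces to 001101, which reduces straight back to 10100, so the process cycles with period two and never terminates.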