What is meant by "free choice" in mathematics?


The following quoted text was written before 1963, so the authors didn't have the benefit of decades of automated computing machines to inform their statements.

The essential feature of an algorithm is that it requires no inspiration or inventiveness but only the ability to recognize sets of symbols and to combine them and break them up according to rules prescribed in advance; in other words, to carry out elementary procedures that can in principle be entrusted to a machine.

[$\dots$]

In the above examples every step is, in general, uniquely determined. But in other algorithms it may happen that each step depends upon a free choice among several (finitely many) possibilities. For example, consider the algorithm ($f$) which, when applied to two prescribed integers $a,b$ (in decimal notation), leaves open at each step a free choice between two possibilities: when two numbers (including $a$ and $b$) are already found, we may take (1) their sum or (2) their difference. This algorithm enables us to find all the numbers in the module $\left(a,b\right)$ generated by the two numbers $a$ and $b$.
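To make the quoted algorithm concrete, here is a minimal sketch (not part of the quoted text, and the function name and bound are my own) that explores *every* sequence of free choices at once, collecting everything reachable by repeated sums and differences within a fixed range:

```python
from math import gcd

def module_members(a, b, bound=50):
    """Explore every sequence of 'free choices' (take a sum or a
    difference of two numbers already found), starting from {a, b}
    and discarding results outside [-bound, bound]."""
    found = {a, b}
    new = {a, b}
    while new:
        new = set()
        for x in found:
            for y in found:
                for z in (x + y, x - y):
                    if abs(z) <= bound and z not in found:
                        new.add(z)
        found |= new
    return found

# Every reachable number is a multiple of gcd(a, b), and every such
# multiple in range is reachable -- regardless of the choices made.
members = module_members(6, 10)
assert members == {k for k in range(-50, 51) if k % 2 == 0}
assert gcd(6, 10) in members
```

Whichever choices are made, the results stay inside the module $(a,b)$, which is exactly the set of multiples of $\gcd(a,b)$; exploring all choices shows the whole module is reachable.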

Machines don't make "free choices". I am aware that some manufacturers have from time to time attempted to produce "truly random" number generators. But those ultimately appeal to some physical process that the theories of physics deem "random".

I have no way of applying a meaningful mathematical interpretation to the term "free choice" in the above quoted text. Is "free choice" a term that has a precise mathematical definition?


Best answer:

In this case, the quote is only talking about non-deterministic algorithms.

In this particular case, we can formally define "leaves open a free choice" by saying:

A particular output is possible from a given input if there is at least one sequence of choices that leads the algorithm to that output from that input.

If there is more than one option at some stage of the algorithm, then more than one output may be possible from a given input.

The term "free" only means that every possible choice at each step is considered when we ask about whether a particular output is possible; the "choices" are not constrained in any way.

Note that we do not define what "free choice" means - we define what it means for an algorithm to lead from an input to an output if there are possible choices along the way. This is similar to the way that "the limit of $f(x)$ as $x$ approaches $0$" is defined in calculus without actually defining what it means to "approach zero".

As an example (the flowchart below is taken from Wikimedia Commons), the following flowchart shows a process that begins with a "free choice" of going from $S_0$ to $S_1$ or $S_2$. This flowchart accepts the regular language $(1^*(01^*01^*)^*) \cup (0^*(10^*10^*)^*) \cup \epsilon$; the top half accepts the first option, while the bottom half accepts the second.

When we talk about "running" the algorithm on a specific input, we formally mean to consider every path through the flowchart that agrees with the input, and we say that the input is accepted if there is at least one path that leads to an accepting state. By considering every path, we consider every possible sequence of choices; this can be colloquially defined as imagining that someone "makes a free choice" whenever a choice is possible.

[NDFA flowchart from Wikimedia Commons]
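The "consider every path" idea can be carried out directly in code. Below is a small simulator (my own sketch, not from the answer; the state names are a hypothetical encoding of the flowchart) for a machine whose start state makes a free $\epsilon$-choice between an "even number of 0s" branch and an "even number of 1s" branch, which is the language quoted above. Instead of choosing, it tracks the set of all states reachable under *some* sequence of choices:

```python
# A tiny NFA simulator: "running" it means tracking every state that
# some sequence of free choices could reach. EPS marks a free choice
# made without consuming input.
EPS = ""
delta = {
    ("start", EPS): {"z_even", "o_even"},   # the free choice
    ("z_even", "0"): {"z_odd"},  ("z_odd", "0"): {"z_even"},
    ("z_even", "1"): {"z_even"}, ("z_odd", "1"): {"z_odd"},
    ("o_even", "1"): {"o_odd"},  ("o_odd", "1"): {"o_even"},
    ("o_even", "0"): {"o_even"}, ("o_odd", "0"): {"o_odd"},
}
accepting = {"z_even", "o_even"}

def eps_closure(states):
    """Add every state reachable by free (input-free) choices."""
    states = set(states)
    while True:
        extra = set().union(*(delta.get((s, EPS), set()) for s in states)) if states else set()
        if extra <= states:
            return states
        states |= extra

def accepts(word):
    """Accept iff at least one sequence of choices ends in an accepting state."""
    current = eps_closure({"start"})
    for ch in word:
        current = eps_closure(
            set().union(*(delta.get((s, ch), set()) for s in current)) if current else set()
        )
    return bool(current & accepting)

assert accepts("")          # the empty word is accepted
assert accepts("0101")      # even number of 0s (and of 1s)
assert accepts("011")       # odd 0s, but even 1s: the bottom branch accepts
assert not accepts("01")    # one 0 and one 1: neither branch accepts
```

No single run "makes" the choice; acceptance is defined over the whole set of possible runs, which is exactly the formalization the answer describes.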

Second answer:

The rationale behind free (or arbitrary) choices is that the algorithm remains correct whatever choices are made. Leaving the choice open may create opportunities for optimization. The option is left to the implementer, who will usually define a deterministic choice rule.

Machines making truly random choices are rare and not so useful (except perhaps in high-end cryptographic applications).