Is the mathematical concept of an "operation" necessarily deterministic?


Does the mathematical concept of an operation require that the process is deterministic?

If not, what are some example cases for non-deterministic operations?

Motivation: I am coming from a software background and want to reconcile the subtle distinctions between the semantic intention of operators/operations in software and in pure math. Obviously, operations in software applications can be non-deterministic, but I don't think this carries over to pure math.

There are 4 answers below.

---
An operation is a type of function, and all functions by definition give exactly one result for each combination of inputs. That result might be defined in a way that's difficult to figure out, but there can never be more than one result.

Probability theory defines random variables, which model nondeterministic quantities, and theorems can be proved about them. Even so, a random variable is formally a deterministic function on a sample space; the apparent randomness lies in which outcome occurs, not in the function itself.
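To make the point concrete, here is a small illustrative sketch (the names `sample_space` and `num_heads` are my own, not from any particular textbook) showing that a random variable is just an ordinary function from outcomes to numbers:

```python
# A "random variable" in probability theory is a deterministic function
# from a sample space to the reals. The randomness lives in which
# outcome occurs, not in the function itself.

sample_space = ["HH", "HT", "TH", "TT"]  # two fair coin flips

def num_heads(outcome):
    """A random variable: deterministically maps each outcome to a number."""
    return outcome.count("H")

# Calling it twice on the same outcome always gives the same value.
assert num_heads("HT") == num_heads("HT") == 1
print([num_heads(o) for o in sample_space])  # [2, 1, 1, 0]
```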

---

Operations in computer science are nondeterministic mainly to the extent that they take nondeterministic inputs, which a mathematical operation is equally capable of doing. If you repeated a "nondeterministic algorithm" on the exact same set of inputs (including any random bits it consumes), it would produce the exact same output. Every algorithm, program, or circuit can be treated as a mathematical object, so any program you would consider nondeterministic is nondeterministic in exactly the same sense mathematically.
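A toy sketch of this point, assuming the hidden random choices are made explicit as a seed input (the function name here is illustrative): once the seed is part of the input, a "randomized" routine is a pure function.

```python
import random

def shuffled(xs, seed):
    """Toy 'nondeterministic' routine: shuffles its input with a seeded RNG.
    With the seed treated as part of the input, it is a pure function."""
    rng = random.Random(seed)
    ys = list(xs)
    rng.shuffle(ys)
    return ys

# Repeating the algorithm on the exact same inputs (seed included)
# reproduces the exact same output.
a = shuffled(range(10), seed=42)
b = shuffled(range(10), seed=42)
assert a == b
```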

The trouble with calling results about random variables nondeterministic is that those results never depend on the variable's value at any one moment; hence the information they derive is no less determined than any other program's output. Likewise, fuzzy logic does not assign truth to a proposition on some occasions and withhold it on others: consistent circumstances assign a consistent partial truth to every proposition.
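The fuzzy-logic remark can be sketched with a hypothetical membership function (the name `tallness` and the thresholds are my own illustration, not a standard definition): the partial truth it assigns is a plain deterministic function of the circumstances.

```python
def tallness(height_cm):
    """A fuzzy membership function: assigns a partial truth in [0, 1] to
    the proposition 'this person is tall'. It is deterministic: the same
    circumstances always yield the same partial truth."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

# The same height always receives the same partial truth value.
assert tallness(175) == tallness(175) == 0.5
```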


EDIT: Regarding treating these behaviors as nondeterministic:

In those cases you could still model the entire computer deterministically, as one program. You could therefore say that while any program that compiles without error is well defined in the abstract, uncompiled code is not a well-defined representation of the compiled program, because it is phrased as if it did not change depending on the state of the computer. Compiled code, on the other hand, is defined in terms of operations that are sensitive to the state of the computer, so all variation is accounted for.

Of course, that only holds up to the point where the computer's state behaves deterministically as a physical aspect of the machine. If you need to take thermal noise into account, you step back again: just as a program must be defined in terms of the machine instructions that produce its output, the machine instructions must be defined in terms of the electricity in the system, and assuming physical behavior is itself well defined, there is a correct model of the electricity that determines the output of the program.
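The "make the state explicit" move above can be sketched as follows (the names `step_hidden` and `step_explicit` are illustrative): a routine that reads hidden mutable state looks nondeterministic, but the same computation with the state passed in as an argument is an ordinary deterministic function.

```python
machine_state = {"counter": 0}

def step_hidden():
    """Appears nondeterministic: output depends on hidden mutable state."""
    machine_state["counter"] += 1
    return machine_state["counter"]

def step_explicit(state):
    """Same computation with the state as input: fully deterministic.
    Returns the output together with the successor state."""
    new_counter = state["counter"] + 1
    return new_counter, {"counter": new_counter}

# Same state in, same result out -- every time.
out1, s1 = step_explicit({"counter": 0})
out2, s2 = step_explicit({"counter": 0})
assert out1 == out2 == 1
```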

---

An operation, in mathematics as in computer science, is an algorithm that produces, given input data of a specified type (numbers, vectors, functions, etc.), output data of a specified type. As such it is deterministic: at each instant of the execution process you know exactly what to do next. Depending on the environment, however, the algorithm could ask for a random number or even consult an oracle, in which case the outcome is not "deterministic" as a function of the input. (Using a "pseudo-random number" generator the process is of course still completely deterministic, but that is not the idea when the "operation" you are considering asks for a random number.)
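The parenthetical about pseudo-random generators can be illustrated with a minimal linear congruential generator (the constants below are a commonly used choice; the function name is my own): the whole "random" stream is a pure function of the seed.

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """A minimal linear congruential generator: entirely deterministic.
    Returns the first n values of the pseudo-random stream for a seed."""
    out = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# Re-running with the same seed reproduces the same "random" numbers.
assert lcg(7, 5) == lcg(7, 5)
```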

---

In the "normal" setting of mathematics (not being a logician or set theorist, I do not know about any exotic axiomatic settings), there is no such thing as non-determinism, because the concept of "randomness" is a human heuristic. There is actually a decent amount of philosophical discussion about the "meaning" of probability, on which I am no expert.

Consider the classic example: flipping a fair coin. Our human intuition is that the "randomness" means (according to the "frequency" interpretation) that on any single flip we don't know whether it will be heads or tails, but over many flips roughly half appear to come up heads and half tails.
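A quick simulation sketch of the frequency interpretation (note the irony: seeding the generator makes the whole "random" experiment reproducible, which is rather the point of this thread):

```python
import random

rng = random.Random(0)  # seeded, so the entire experiment is reproducible
flips = [rng.random() < 0.5 for _ in range(100_000)]
heads_frequency = sum(flips) / len(flips)
print(heads_frequency)  # close to 0.5; the exact value depends on the seed
```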

However, the mathematical formulation is less illuminating. A probability measure is just a function that assigns to each event a number between $0$ and $1$. In this case, the outcome "heads" is assigned $1/2$, and "tails" likewise. Mathematically, then, the probability measure is just a deterministic, real-valued function.
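As a closing sketch, the fair-coin measure really is just a plain function from events (subsets of the outcome space) to numbers, here written with exact fractions (the name `P` and the two-outcome space are my own illustration):

```python
from fractions import Fraction

outcomes = frozenset({"heads", "tails"})

def P(event):
    """The fair-coin probability measure: deterministic, the same event
    always receives the same probability."""
    return Fraction(len(event & outcomes), len(outcomes))

assert P({"heads"}) == Fraction(1, 2)
assert P(outcomes) == 1  # the whole space has probability 1
```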