What are the most fundamental applications of probability theory outside pure math?

What are the most important / fundamental / classical applications of probability theory outside of pure math?

What were some of the original "home runs" of probability theory -- things that we could not do before, but which proved very useful? Things that would make people say, "Gosh, we've hit on a big idea here."

One example would be that probability is used in Quantum Mechanics. That's a good one, but I believe probability had already proved to be very useful even before QM came along.

BEST ANSWER

A few (of many) significant contributions.

Statistics - In General

The language of statistics is probability; parametric statistics relies on mapping real world data to known probability distributions.

Statistical tests then work by using known properties of specific distributions (Normal, $\chi^2$, etc.) to make inferences about the observed data, for instance via the Central Limit Theorem.

Model fitting (e.g. estimating averages, or regression) relies on Maximum Likelihood Estimation, which takes as a starting point an assumed probability distribution with fixed but unknown parameters, and uses knowledge of the distribution, together with the observed data, to derive estimates of those parameters.
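
As a minimal sketch of the idea (hypothetical code, not tied to any statistics library), numerically maximizing the Bernoulli log-likelihood recovers the familiar closed-form estimate $\hat{p} = h/n$:

```python
import math

def bernoulli_log_likelihood(p, heads, n):
    """Log-likelihood of a Bernoulli(p) model given `heads` successes in `n` trials."""
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

def mle_bernoulli(heads, n, grid_size=10_000):
    """Find the p maximizing the log-likelihood by a simple grid search."""
    candidates = [(i + 1) / (grid_size + 1) for i in range(grid_size)]
    return max(candidates, key=lambda p: bernoulli_log_likelihood(p, heads, n))

# The numerical maximizer agrees with the closed form heads / n.
print(mle_bernoulli(30, 100))  # close to 0.3
```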

That's not to mention that probability is often described as having been invented by Pascal and Fermat to resolve a question about a gambling game.

Statistics - Frequentist

The frequentist approach to statistics inherently assumes a law of large numbers; for example, a frequentist approach to identifying the bias of a coin would be to assume that there exists a value $p_h$, the probability of observing a head, then toss the coin $n$ times and record the number of heads $h(n)$ observed. They would then make the assumption

$$ \lim_{n \rightarrow \infty} \frac{h(n)}{n} = p_h.$$

To make this statement rigorous, one would use the law of large numbers.
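
The statement can be illustrated with a short simulation (an illustrative sketch; the coin bias and sample size below are arbitrary choices):

```python
import random

def empirical_frequency(p_h, n, seed=0):
    """Toss a coin with P(heads) = p_h a total of n times; return h(n)/n."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p_h for _ in range(n))
    return heads / n

# By the law of large numbers, h(n)/n approaches p_h as n grows.
print(empirical_frequency(0.3, 100_000))  # close to 0.3
```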

Statistics - Bayesian

The Bayesian approach to statistics is heuristically described as having an initial belief about a model, and updating your belief based on observed data. For example, before tossing a coin you may believe that the probability of it showing heads is uniform on the interval $[0,1]$, and you look to refine this belief based on observed data. Bayesian analysis achieves this through applying Bayes Rule.
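
For a coin this update has a closed form: a uniform prior on $[0,1]$ is the Beta(1, 1) distribution, and by conjugacy, observing heads and tails simply increments the Beta parameters (a standard computation; the observed counts below are made up for illustration):

```python
def beta_binomial_update(alpha, beta, heads, tails):
    """Conjugate update: Beta(alpha, beta) prior plus observed coin tosses -> Beta posterior."""
    return alpha + heads, beta + tails

# Uniform prior on [0, 1] is Beta(1, 1); observe 7 heads and 3 tails.
a, b = beta_binomial_update(1, 1, 7, 3)
posterior_mean = a / (a + b)  # 8 / 12 = 2/3
print(a, b, posterior_mean)
```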

Moving away from purely statistical applications:

Numerical Analysis

Monte Carlo methods give rise to probabilistic evaluation of high dimensional integrals based on sampling.
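
As a toy illustration (the specific integral, $\int_0^1 \sqrt{1-x^2}\,dx = \pi/4$, is chosen only for familiarity -- the same idea scales to high dimensions where deterministic quadrature breaks down):

```python
import random

def mc_quarter_circle_area(n, seed=0):
    """Estimate the integral of sqrt(1 - x^2) over [0, 1] (= pi/4) by uniform sampling."""
    rng = random.Random(seed)
    # A uniform point (x, y) in the unit square lies under the curve iff x^2 + y^2 < 1.
    inside = sum(rng.random() ** 2 + rng.random() ** 2 < 1.0 for _ in range(n))
    return inside / n

print(4 * mc_quarter_circle_area(100_000))  # close to pi
```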

In particular, applications of Markov chains led to the advent of Markov Chain Monte Carlo (MCMC), which provides fast-converging numerical approximations to such integrals and is essential for modern computation. Somewhat circuitously (probability enabling probability!), this is perhaps best seen in how it enables efficient application of Bayesian methods.
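
A minimal Metropolis sampler, targeting a standard normal for concreteness (an illustrative sketch, not a production MCMC implementation):

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, seed=0):
    """Metropolis sampler for the standard normal density, known only up to normalization."""
    rng = random.Random(seed)
    log_target = lambda x: -0.5 * x * x  # log density, ignoring the constant
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, target(proposal) / target(x)).
        if rng.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_normal(50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # near 0 and 1, the moments of the standard normal
```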

Finance

Much of modern finance is based on stochastic analysis, or Itô calculus. A particularly famous example is the Black-Scholes formula for option pricing. The founders of this analysis were awarded the 1997 Nobel Memorial Prize in Economics for "a new method to determine the value of derivatives". More recently it has often been cited as one of the drivers of the 2007-8 financial crisis, though this is disputed.
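
For concreteness, the Black-Scholes price of a European call option can be computed directly from the closed-form formula (a sketch using only the standard library; the parameter values are arbitrary):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes European call: spot S, strike K, rate r, volatility sigma, maturity T."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, 5% rate, 20% volatility, one year to maturity.
print(black_scholes_call(100, 100, 0.05, 0.2, 1.0))  # about 10.45
```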

Cryptography

Modern cryptography often relies on methods such as RSA encryption, whose security rests on the difficulty of factorizing products of large prime numbers. To set up RSA, however, random prime numbers need to be chosen first, which relies on the field of (pseudo-)random number generation. While many methods of random number generation do not rely on probability, some do; further, assessing the level of "randomness" achieved often relies on statistical statements.
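
Probability enters primality testing itself: the Miller-Rabin test declares a number a probable prime if no randomly chosen base witnesses compositeness (a compact sketch; `rounds` controls the error probability, which shrinks at least as fast as $4^{-\text{rounds}}$):

```python
import random

def is_probable_prime(n, rounds=40, seed=0):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13):
        if n % small == 0:
            return n == small
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    rng = random.Random(seed)
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # base a witnesses that n is composite
    return True

print(is_probable_prime(2 ** 61 - 1))  # True: a Mersenne prime
print(is_probable_prime(2 ** 61 + 1))  # False: divisible by 3
```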

Statistical Mechanics

There are many models from theoretical physics which can be re-envisaged as probabilistic models.

In (classical) statistical mechanics this often follows by noting that the partition function $Z = \int e^{H}$ ($H$ is a Hamiltonian, and I am being deliberately vague in my notation) is an integral over a positive function, and as such defines the normalizing factor of a probability distribution $e^H/Z$. Then standard statistics about a probability distribution (mean, correlation, etc.) take on meanings from a physical perspective. The classical example is the Ising Model, though there are many more (Potts, dimers, spin glasses, self-avoiding walk, etc.).

In the case of quantum mechanical models, a similar starting point is taken but then Feynman's path integral allows one to link quantum models to models involving Brownian motion (or random walks). Examples include the probabilistic interpretation of the Bose Gas as a system of Brownian loops.

Combinatorics

Many statements in combinatorics can be recast in a probabilistic light, since enumerating combinatorial structures can be seen as equivalent to defining a (uniform) probability distribution on those structures; the field of self-avoiding random walks is one example.

Another link is through the probabilistic method, which uses probability to prove the existence of certain combinatorial objects: by showing that a certain event occurs with non-zero probability, you can infer that an object satisfying the corresponding condition exists.
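
A classic instance is the lower bound on Ramsey numbers: color each edge of $K_n$ red or blue by a fair coin flip; if the expected number of monochromatic $K_k$'s is below 1, some coloring has none, so $R(k,k) > n$. A quick computation (illustrative sketch):

```python
from math import comb

def expected_monochromatic_cliques(n, k):
    """Expected number of monochromatic K_k's under a uniformly random 2-coloring of K_n's edges."""
    # Each of the C(n, k) potential cliques is monochromatic with probability 2^(1 - C(k, 2)).
    return comb(n, k) * 2 ** (1 - comb(k, 2))

# 462/512 < 1, so some coloring of K_11 has no monochromatic K_5: R(5,5) > 11.
print(expected_monochromatic_cliques(11, 5))
```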

Ergodic Theory

Ergodic theory is an area in its own right, but again is closely linked to probability. One application is showing that almost all numbers are normal, even though no naturally occurring constant (such as $\pi$ or $e$) has been proven to be normal; the proof relies on Birkhoff's ergodic theorem.

Number Theory

Finally, a slightly more esoteric example: probability theory is often used to provide heuristics for why classical number-theoretic statements should be true. A common starting point in such arguments is to assume that $n$ is "prime with probability $1/\log n$" (this itself needs explanation, since you cannot have a uniform distribution on $\mathbb{N}$) and to use this to infer statements about the primes (e.g. Goldbach's Conjecture, which has not yet been proven rigorously).
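
A small computation in this spirit (an illustrative sketch of the heuristic, not a rigorous argument) compares the exact prime count with the density-$1/\log n$ prediction:

```python
import math

def prime_count(n):
    """Exact pi(n) via a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return sum(sieve)

def heuristic_prime_count(n):
    """Heuristic count: pretend each m >= 2 is prime independently with probability 1/log m."""
    return sum(1.0 / math.log(m) for m in range(2, n + 1))

# The heuristic tracks the true count to within a fraction of a percent here.
print(prime_count(100_000), heuristic_prime_count(100_000))
```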

ANOTHER ANSWER

Outside pure maths, probability and statistics are

  • what allow insurance companies, banks, and almost any large-scale company to evaluate risks with such accuracy that they can devise plans that are not too expensive for the customer, yet profitable for themselves -- and I'd say that for these companies, this is by far the home run you're looking for.
  • used for medical purposes: on the one hand to understand and control the spread of diseases using statistics and epidemiology (look up Florence Nightingale), on the other hand by seeking correlations between illness and other factors in order to find unexpected possible causes.
  • used basically everywhere nowadays if you look carefully enough...