You can draw from the Cauchy distribution by attaching a stick to a spindle somewhere on the y-axis, spinning it, and reading off the x-intercept as your drawn value. Where you place the spindle on the y-axis parameterizes the distribution.
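That spinner process is easy to mimic numerically (a sketch of the device, not the device itself): the spin angle is uniform, and the x-intercept is the tangent of that angle scaled by the spindle height, which gives exactly a Cauchy draw.

```python
import math
import random

def spinner_cauchy(gamma=1.0, n=100_000):
    """Spindle at height `gamma` on the y-axis: spin to a uniform angle,
    read off the x-intercept gamma * tan(theta). Sketch of the process."""
    draws = []
    for _ in range(n):
        theta = random.uniform(-math.pi / 2, math.pi / 2)  # spin direction
        draws.append(gamma * math.tan(theta))              # x-intercept
    return draws

draws = spinner_cauchy()
# The Cauchy CDF is F(x) = 1/2 + atan(x/gamma)/pi, so for gamma = 1
# the fraction of draws below x = 1 should be near F(1) = 0.75.
frac = sum(d < 1.0 for d in draws) / len(draws)
```

Note that the empirical mean of `draws` will *not* settle down, which is itself good intuition: the Cauchy distribution has no mean.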
Are there (practical) physical processes for other distributions? Especially ones that could actually generate all possible values? Throwing darts at a dart board, for instance, is a poor process for the normal distribution since there’s a bound on how far you’ll ever get from the center.
For the geometric distribution you can roll a die until you get a 1, say, though this requires a different die for each value of the single parameter, so it's not as clean as the Cauchy case.
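For what it's worth, the die-rolling process simulates directly (a sketch with a fair six-sided die, i.e. $p = 1/6$, counting the roll on which the first 1 appears):

```python
import random

def geometric_by_die(sides=6, trials=50_000):
    """Roll a fair `sides`-sided die until a 1 appears; the number of
    rolls needed is Geometric(p = 1/sides)."""
    counts = []
    for _ in range(trials):
        rolls = 1
        while random.randint(1, sides) != 1:
            rolls += 1
        counts.append(rolls)
    return counts

counts = geometric_by_die()
mean = sum(counts) / len(counts)  # expected number of rolls is 1/p = sides
```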
EDIT: In particular I'm thinking of processes that give you some physical intuition for the distribution. Sure you can convert a randomly drawn number from one distribution into a draw from another distribution, but that doesn't give you any intuition for the second distribution.
Ok, this may not help directly, but it comes from probability. Before the gamma function, let's first consider a different function: the exponential density. If $X \sim \mathrm{Exp}(\lambda)$ then it has probability density function $$f(x;\lambda) = \begin{cases} \lambda e^{-\lambda x} & \text{for } x \geq 0 \\ 0 & \text{for } x < 0 \end{cases}$$
Ok, so first: an exponentially distributed random variable describes the waiting time between events in a Poisson point process.
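As a quick sanity check (a sketch, not a physical process, with an arbitrary rate): the inverse-CDF trick turns uniform draws into exponential draws, and the sample mean should come out near $1/\lambda$.

```python
import math
import random

lam = 2.0  # rate parameter, an arbitrary choice for the demo

def exp_draw(lam):
    # Inverse transform: if U ~ Uniform(0,1), then -ln(1 - U)/lam
    # has density lam * exp(-lam * x) for x >= 0.
    return -math.log(1.0 - random.random()) / lam

draws = [exp_draw(lam) for _ in range(100_000)]
mean = sum(draws) / len(draws)  # should be close to 1/lam = 0.5
```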
Now the sum of two independent exponential random variables is a gamma distributed random variable. The density of a sum of independent random variables is given by the convolution of their densities: if $X = X_1 + X_2$, then $$f_X(x) = \int_{-\infty}^{\infty} f_{X_1}(x - x_2) f_{X_2}(x_2)\, dx_2.$$ With $X_1, X_2 \sim \mathrm{Exp}(\lambda)$ this becomes $$f_X(x) = \int_0^x \lambda e^{-\lambda(x - x_2)} \lambda e^{-\lambda x_2}\, dx_2 = \int_0^x \lambda^2 e^{-\lambda x}\, dx_2 = \lambda^2 x e^{-\lambda x},$$ so $$f_X(x) = \begin{cases} \lambda^2 x e^{-\lambda x} & \text{for } x \geq 0 \\ 0 & \text{for } x < 0 \end{cases}$$ which is the gamma density with shape $\alpha = 2$ and rate $\beta = \lambda$.
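A quick simulation check of that derivation (a sketch with $\lambda = 1$): summing two independent $\mathrm{Exp}(\lambda)$ draws should match the $\lambda^2 x e^{-\lambda x}$ density, whose CDF works out to $1 - e^{-\lambda x}(1 + \lambda x)$.

```python
import math
import random

lam = 1.0  # rate, an arbitrary choice for the demo
n = 100_000

def exp_draw():
    # inverse-transform draw from Exp(lam)
    return -math.log(1.0 - random.random()) / lam

sums = [exp_draw() + exp_draw() for _ in range(n)]

# Gamma(shape=2, rate=lam) has mean 2/lam and CDF F(x) = 1 - e^{-lam x}(1 + lam x).
mean = sum(sums) / n
frac = sum(s < 1.0 for s in sums) / n
target = 1 - math.exp(-lam * 1.0) * (1 + lam * 1.0)  # F(1), roughly 0.264
```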
Now the gamma density function is $$f(x;\alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}$$
and the gamma function is $$\Gamma(x) = \int_{0}^{\infty} u^{x-1} e^{-u}\, du.$$ The relationship here: the exponential density tells us the time between events, and when $\alpha$ is a whole number the gamma density is the distribution of the sum of $\alpha$ such waiting times, with $\alpha$ acting as a shape parameter and $\beta$ as a rate parameter. The gamma function itself is the normalizing constant that makes the density integrate to 1. Gamma models come up in practice in things like insurance claims and waiting times, I believe. The relationship between the gamma function and the beta function shows up in the beta density: $$f(x;\alpha, \beta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1} (1-x)^{\beta-1}$$ $$=\frac{1}{B(\alpha,\beta)}x^{\alpha-1} (1-x)^{\beta-1}$$ note here $$= \frac{x^{\alpha-1}(1-x)^{\beta-1}}{\int_{0}^{1}u^{\alpha-1}(1-u)^{\beta-1}\, du}$$ and $B(\alpha, \beta)$ is that integral on the bottom. Essentially these are ways of describing continuous probability distributions (over time for the gamma, over proportions for the beta) and the parameters of their shape.
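To make the $B(\alpha,\beta)$ identity concrete, here's a small numerical check (midpoint rule, my own sketch) that the integral matches $\Gamma(\alpha)\Gamma(\beta)/\Gamma(\alpha+\beta)$; for $\alpha = 2, \beta = 3$ the exact value is $1!\,2!/4! = 1/12$.

```python
import math

def beta_fn(a, b):
    """B(a, b) via the gamma-function identity B(a,b) = G(a)G(b)/G(a+b)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def beta_fn_numeric(a, b, steps=100_000):
    """B(a, b) as the integral of x^(a-1) (1-x)^(b-1) over (0, 1),
    approximated with the midpoint rule."""
    h = 1.0 / steps
    return sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1)
               for i in range(steps)) * h

exact = beta_fn(2, 3)          # 1/12
approx = beta_fn_numeric(2, 3)  # should agree to many digits
```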
For some more distributions.
If $X \sim \mathrm{Bin}(n,p)$ is a binomial random variable we can see from the mass function what it describes:
$$ f(k;n,p) = \Pr(X=k) = \binom{n}{k} p^{k}(1-p)^{n-k} $$ is the probability of exactly $k$ successes in $n$ independent trials, each with success probability $p$.
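That mass function translates directly to code (a sketch using the standard library's `math.comb` for the binomial coefficient):

```python
import math

def binom_pmf(k, n, p):
    """Pr(X = k) for X ~ Bin(n, p): C(n, k) * p^k * (1-p)^(n-k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# The probabilities over k = 0..n must sum to 1.
total = sum(binom_pmf(k, 10, 0.3) for k in range(11))

# A small hand-checkable case: C(4, 2) * 0.5^4 = 6/16 = 0.375
p_two_of_four = binom_pmf(2, 4, 0.5)
```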
In the Poisson case, $X \sim \mathrm{Pois}(\lambda)$ is related to our exponential distribution:
$$ f(k;\lambda) = \Pr(X=k) = \frac{\lambda^{k}e^{-\lambda}}{k!}$$ is the probability of $k$ events happening given our rate parameter $\lambda$. This could apply to many things. It's used in insurance as well, and in bioinformatics, for example counting genetic defects in cancer. A very practical application is finance, like the number of times a stock ticks up or down in a given time frame.
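The exponential/Poisson connection from earlier can be checked directly (a sketch): lay down events whose gaps are $\mathrm{Exp}(\lambda)$ draws, count how many land in one unit of time, and the counts should follow the Poisson mass function above.

```python
import math
import random

def poisson_pmf(k, lam):
    """Pr(X = k) for X ~ Pois(lam)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

def poisson_draw(lam):
    """Count events in one unit of time when the gaps between
    successive events are independent Exp(lam) waiting times."""
    t, k = 0.0, 0
    while True:
        t += -math.log(1.0 - random.random()) / lam  # next inter-arrival gap
        if t > 1.0:
            return k
        k += 1

lam = 3.0   # rate, an arbitrary choice for the demo
n = 50_000
draws = [poisson_draw(lam) for _ in range(n)]
frac_two = sum(d == 2 for d in draws) / n  # compare with poisson_pmf(2, lam)
```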