Exact concept of Monte Carlo Method


I am a programmer and just came across a section where the Monte Carlo method was discussed. I would like to know the exact concept of Monte Carlo simulation. On the net I have read that it refers to computational algorithms that rely on repeated random sampling to obtain a numerical result, but I want the exact concept from a programmer's point of view. I am not getting the essence of the Monte Carlo algorithm.

Edit: For example, generating a series of random numbers so that no number repeats.


There are 3 answers below.


Monte Carlo methods are extremely useful for calculating approximate solutions to definite integrals.

Since functions that have a primitive in closed form are few, having numerical ways of approximating the value of an integral is an important instrument to keep in your toolbox. Common methods for doing this discretize the interval of integration $[a,b]$ and approximate the integral on each of the resulting subintervals (e.g. through Riemann sums).

Stochastic methods for approximating integrals numerically work in a different way: two numbers are fixed in advance, the required accuracy and the maximum acceptable probability that the error exceeds it. We do not require certainty that the error will lie in a certain interval; instead we fix an acceptable probability that it will.

This approximate method has the advantage that it only requires a random number generator, but for single-variable calculus deterministic approaches are more efficient, because the Monte Carlo method requires many random numbers to be generated. This changes radically in multivariable calculus, where Monte Carlo algorithms are far more efficient than deterministic ones once the functions get complicated.
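As a toy illustration of the idea (my own example, not from the answer): the classic hit-or-miss estimate of $\pi$ samples points uniformly in the unit square and counts how many fall inside the quarter disc, whose area is $\pi/4$. Only a random number generator is needed.

```python
import random

def estimate_pi(n):
    """Hit-or-miss Monte Carlo: the fraction of n random points in the
    unit square landing inside the quarter disc x^2 + y^2 <= 1
    approximates its area, pi/4."""
    hits = sum(1 for _ in range(n)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * hits / n

random.seed(0)  # fixed seed so the run is reproducible
print(estimate_pi(100_000))  # an estimate close to 3.14159
```

The answer is only probabilistic: with 100,000 samples the estimate is very likely within a few hundredths of $\pi$, but never guaranteed to be.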


Given $f$ an integrable function on $[0,1]$, we have:

$$ I:=\int_0^1 f(x)\, dx = \mathbb{E}\left[f(U) \right] $$

with $U$ random variable uniformly distributed over $[0,1]$. Drawing independent points $U_1,U_2,\ldots$, one can get a Monte Carlo estimation of the expectation, hence of the original integral: $$ I_n:= \frac{1}{n}\sum_{i=1}^nf(U_i).$$

It can be shown that $I_n$ converges to $I$ with probability $1$ for large $n$. For square integrable functions it can be shown that the error, $I_n-I$, has mean $0$ and standard deviation inversely proportional to $\sqrt{n}$.

Programming-wise, all that's needed is a good random number generator (RNG).
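The estimator above translates almost directly into code. A minimal sketch (function names are my own; the test integrand $f(x)=x^2$, with exact integral $1/3$, is an arbitrary choice) that also shows the error shrinking roughly like $1/\sqrt{n}$:

```python
import random

def I_n(f, n):
    # Monte Carlo estimate I_n = (1/n) * sum of f(U_i),
    # where each U_i is uniformly distributed over [0, 1].
    return sum(f(random.random()) for _ in range(n)) / n

random.seed(1)  # fixed seed so the run is reproducible
exact = 1 / 3   # integral of x^2 over [0, 1]
for n in (100, 10_000, 1_000_000):
    print(n, abs(I_n(lambda x: x * x, n) - exact))
```

Each hundredfold increase in $n$ buys roughly one extra decimal digit of accuracy, which is why deterministic quadrature wins in one dimension but Monte Carlo wins in high dimensions.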


The Monte Carlo method is used for problems where the input parameters are not exactly known. If there are many input parameters, it is impractical to try every possible combination, so a large (but manageable) collection of possibilities is trialled by randomly selecting each parameter's value from a predefined range of acceptable values (which may be probability-weighted within that range). The collection of results can then be analysed as a probability distribution, telling you, for example, the mean and median expected result, the probability of exceeding some value, and so on.

In my work with groundwater solute transport, I try to estimate the concentration of a solute in the water at a well, given a certain source of the solute. The result can depend on travel time, chemical reaction rate, and infiltrating rainfall rate, among other parameters, and for most of these it is impossible to be sure of the true value. So I decide what range of values each parameter could plausibly take, judge how likely each value within that range is, and define probability density functions accordingly. I then use a random number generator to select a value for each parameter, so that values with higher probability density are more likely to be chosen. The equation is evaluated perhaps 10000 times with different, randomly selected parameter sets, and the results are collected, ordered, and presented as a cumulative probability plot.
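The workflow described above can be sketched in a few lines of Python. Everything specific here is hypothetical: the first-order decay model, the parameter names, and the distributions all stand in for the real transport equation and field data, which the answer does not spell out.

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def concentration(travel_time, decay_rate, source_conc):
    # Hypothetical first-order decay model: C = C0 * exp(-k * t).
    # A placeholder for the real solute-transport equation.
    return source_conc * math.exp(-decay_rate * travel_time)

n_trials = 10_000
results = []
for _ in range(n_trials):
    # Draw each uncertain parameter from an assumed distribution.
    t = random.uniform(5.0, 50.0)         # travel time, years (assumed range)
    k = random.lognormvariate(-3.0, 0.5)  # reaction rate per year (assumed)
    c0 = random.uniform(80.0, 120.0)      # source concentration, mg/L (assumed)
    results.append(concentration(t, k, c0))

# Order the outcomes and read off cumulative-probability summaries.
results.sort()
print("median:", results[len(results) // 2])
print("95th percentile:", results[int(0.95 * len(results))])
```

Plotting the sorted results against rank / n_trials would give the cumulative probability plot the answer describes; here only two quantiles are printed.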