This is surely a simple question for many. Suppose a sum S of linked variables must be simulated:
- x: an independent variable sampled from an empirical PDF.
- a = f1(x): f1 is a function with uncertainty bounds, so a is also sampled.
- b = f2(a): f2 is a function with uncertainty bounds, so b is also sampled.
S = a + b + c + ...
Suppose that a, b, and c are each sampled with 1000 realizations (a 1 × 1000 list).
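For concreteness, here is a minimal sketch of the setup described above. The empirical sample, and the forms of f1 and f2 (a deterministic function plus a per-realization uncertainty draw), are placeholder assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# x: resampled from an empirical PDF (placeholder empirical data; assumption)
empirical_data = np.array([1.2, 3.4, 2.1, 0.7, 2.8])
x = rng.choice(empirical_data, size=n)

# a = f1(x): hypothetical f1 with uncertainty modeled as a multiplicative
# factor drawn independently for each realization (assumption)
a = 2.0 * x * rng.uniform(0.9, 1.1, size=n)

# b = f2(a): hypothetical f2 with additive uncertainty (assumption)
b = 0.5 * a + rng.normal(0.0, 0.1, size=n)

# Each of x, a, b is now a vector of 1000 realizations
print(x.shape, a.shape, b.shape)
```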
```
for i <- 1 to 1000:
    sample a
    for j <- 1 to 1000:
        sample b
        for k <- 1 to 1000:
            sample c
            S <- a + b + c
```
If the sum is organized this way, as a "nested" sampling scheme, the size of S grows quickly as 1000^n, where n is the number of variables in the sum. This is the least efficient way to do it. What better sampling workflow could be implemented?
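To illustrate the growth claim, the nested scheme above is equivalent to forming S over every combination of realizations, which for m samples per variable and 3 variables yields m^3 values. A small sketch (using m = 10 so the full product stays tractable; the normal distributions are placeholder assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 10  # 1000 per variable would already give 10**9 combinations

# Placeholder marginal samples for a, b, c (assumption)
a = rng.normal(1.0, 0.1, m)
b = rng.normal(2.0, 0.2, m)
c = rng.normal(3.0, 0.3, m)

# Nested scheme: S over all (i, j, k) combinations -> m**3 values
S_nested = (a[:, None, None] + b[None, :, None] + c[None, None, :]).ravel()
print(S_nested.size)  # m**3 = 1000 even for m = 10
```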
This is a general question, not tied to any particular software.