Chains of random variables


Domain: estimating and project management.

Let $A_1$ through $A_n$ be random variables with known properties (I'll elaborate later as required). Assume the $A_i$ are mutually independent and that together they form a chain $A=\sum\limits_{i=1}^n{A_i}$.

How would one calculate properties of $A$? Generally I'm interested in things like the expected value and confidence intervals at, say, 95%.

Let's further assume that there's a similar chain $B$ and that $C=A+B$. Would it be meaningful to first calculate properties for $A$ and $B$ and then combine them to $C$?

Pointers to literature (preferably online) much appreciated.


There are 2 best solutions below

On BEST ANSWER

Assuming all expectations and variances are finite:

For expectation, you have in general $E[A]=\sum\limits_{i=1}^n{E[A_i]}$; by linearity of expectation this holds even without independence.

If the $A_i$ are independent, you have $Var[A]=\sum\limits_{i=1}^n{Var[A_i]}$. The variance is the square of the standard deviation.
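Those two rules can be checked numerically. Below is a minimal sketch in Python, assuming three independent uniformly distributed task durations (the ranges are purely illustrative, not from the question): the analytic sums of the per-task means and variances should match the Monte Carlo estimates for the chain.

```python
import random
import statistics

# Hypothetical task durations: three independent Uniform(a, b) "activities".
# The ranges are illustrative assumptions.
ranges = [(1, 3), (2, 5), (4, 8)]

# Analytic mean and variance of Uniform(a, b): (a+b)/2 and (b-a)^2/12.
mean_sum = sum((a + b) / 2 for a, b in ranges)
var_sum = sum((b - a) ** 2 / 12 for a, b in ranges)

# Monte Carlo estimate of the chain A = A_1 + A_2 + A_3.
random.seed(0)
samples = [sum(random.uniform(a, b) for a, b in ranges)
           for _ in range(100_000)]

print(mean_sum, var_sum)                                        # analytic
print(statistics.fmean(samples), statistics.variance(samples))  # simulated
```

The simulated mean and variance should agree with the analytic ones to within sampling noise.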

If the $A_i$ are independent and normally distributed, $A=\sum\limits_{i=1}^n{A_i}$ is also normally distributed. This should enable you to come to results on confidence intervals.
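As a sketch of the normal case, assuming each $A_i$ is given by an (illustrative, made-up) mean and standard deviation, a 95% interval for the chain follows directly from the summed mean and variance:

```python
import math
from statistics import NormalDist

# Illustrative per-task estimates (assumed, not from the question):
# each (mean, standard deviation) pair describes one normal A_i.
tasks = [(5.0, 1.0), (3.0, 0.5), (8.0, 2.0)]

mu = sum(m for m, s in tasks)                      # E[A]: means add
sigma = math.sqrt(sum(s ** 2 for m, s in tasks))   # sd of A: variances add

# 95% interval for a normal variable: mu +/- z * sigma,
# where z is the 97.5th percentile of the standard normal (~1.96).
z = NormalDist().inv_cdf(0.975)
lo, hi = mu - z * sigma, mu + z * sigma
print(f"E[A] = {mu:.2f}, 95% interval ~ ({lo:.2f}, {hi:.2f})")
```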

For $C=A+B$, just apply those same results with $n=2$: if $A$ and $B$ are independent, their means and variances add.
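So yes, it is meaningful to summarize each chain first and then combine the summaries. A minimal sketch, with assumed (illustrative) summary statistics for the two chains:

```python
import math

# Assumed summary statistics for two independent chains (illustrative values).
mean_A, var_A = 11.5, 2.4
mean_B, var_B = 9.0, 1.6

# C = A + B: means add; for independent A and B, variances add too,
# so the standard deviation of C is the square root of the summed variances.
mean_C = mean_A + mean_B
sd_C = math.sqrt(var_A + var_B)
print(mean_C, sd_C)  # -> 20.5 2.0
```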

On

Look up the central limit theorem. If all the $A_i$ come from the same distribution and $n$ is large enough, $A$ will be approximately normally distributed, with mean $n\mu$ and variance $n\sigma^2$, where $\mu$ and $\sigma^2$ are the mean and variance of the distribution of the $A_i$. When adding $A$ and $B$, just follow the rules for adding normal random variables.
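The CLT approximation can be sketched by simulation. Here the $A_i$ are assumed to be iid Exponential(1) (mean 1, variance 1, chosen only for illustration), so the chain is approximately Normal($n$, $n$); the empirical 97.5th percentile should land near the normal one:

```python
import random
import statistics
from statistics import NormalDist

# CLT sketch: a sum of n iid Exponential(1) variables (mean 1, variance 1)
# is approximately Normal(n, n) for large n. n and trials are assumptions.
n, trials = 50, 20_000
random.seed(1)
sums = [sum(random.expovariate(1.0) for _ in range(n))
        for _ in range(trials)]

# Compare the empirical 97.5th percentile with the normal approximation.
# quantiles(..., n=40) returns 39 cut points; index 38 is the 97.5th.
emp = statistics.quantiles(sums, n=40)[38]
approx = NormalDist(mu=n, sigma=n ** 0.5).inv_cdf(0.975)
print(emp, approx)
```

The two percentiles differ slightly because the exponential sum is still a bit right-skewed at $n=50$; the gap shrinks as $n$ grows.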