X is a vector of N independent normally distributed random variables (X1, ..., XN), where Xi ~ N(μi, σi²). Y is a vector of length K, where each Yj is the sum of some subset of the Xi. What is the probability that every Yj is less than some value z?
I'm not quite sure what the technical term is for what I'm looking for, or how to approach it. Y1 will be correlated with Y2 if their subsets share some of the X, and that correlation can be calculated, but I'm not sure where to go from there.
If it makes it easier, assume Y just has length 2 (K=2).
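To make the setup concrete, here is how far I've gotten in code. The subsets, μi, and σi below are made-up examples. Since each Yj is a linear combination of jointly Gaussian variables, I believe Y is itself multivariate normal, with Cov(Yj, Yk) equal to the sum of σi² over the indices shared by the two subsets, so the probability I want would be a multivariate normal CDF evaluated at (z, z):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Made-up example with N = 3, K = 2:
#   Y1 = X1 + X2,  Y2 = X2 + X3  (0-based indices below)
mu = np.array([0.0, 1.0, -0.5])    # means mu_i
sigma = np.array([1.0, 0.5, 2.0])  # standard deviations sigma_i
subsets = [[0, 1], [1, 2]]         # which X_i are summed into each Y_j

K = len(subsets)
# Mean of Y_j is the sum of the mu_i in its subset.
mean_Y = np.array([mu[s].sum() for s in subsets])

# Cov(Y_j, Y_k) = sum of sigma_i^2 over indices in both subsets
# (independence kills all cross terms).
cov_Y = np.zeros((K, K))
for j in range(K):
    for k in range(K):
        shared = set(subsets[j]) & set(subsets[k])
        cov_Y[j, k] = sum(sigma[i] ** 2 for i in shared)

# P(Y1 < z and Y2 < z) as a multivariate normal CDF at (z, z).
z = 2.0
p = multivariate_normal(mean=mean_Y, cov=cov_Y).cdf(np.full(K, z))
print(p)
```

Is evaluating the multivariate normal CDF like this the right approach, or is there a cleaner closed form?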