In the book Pattern Recognition and Machine Learning, C. M. Bishop explains how to compute marginal, conditional, and joint probabilities by building a table whose rows and columns are the outcomes of the individual random variables. A marginal probability can then be calculated simply by summing the entries of the desired row (or column) and dividing by the total number of trials. More on this here.
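To make the counting approach concrete, here is a minimal sketch of that table, with made-up counts (the numbers are illustrative, not Bishop's): each cell holds the number of trials in which a particular (X, Y) pair occurred, and marginals come from row/column sums over the total.

```python
import numpy as np

# Hypothetical counts n_ij over N trials (made-up numbers):
# rows = outcomes of X, columns = outcomes of Y.
counts = np.array([[3, 7],
                   [5, 5]])
N = counts.sum()  # total number of trials

joint = counts / N                    # p(X=i, Y=j) = n_ij / N
marginal_X = counts.sum(axis=1) / N   # sum a row, divide by N
marginal_Y = counts.sum(axis=0) / N   # sum a column, divide by N

print(joint)
print(marginal_X)
print(marginal_Y)
```

By construction the joint table sums to 1, and each marginal is just a row or column sum of the joint.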
A similar approach is followed in the following video to calculate the marginal probabilities.
I can't seem to construct a similar table for the following problem.
I have two boxes: a red one containing 2 apples and 6 oranges, and a blue one containing 1 apple and 3 oranges. X and Y are two random variables for the choice of box and fruit respectively.
I understand that you need to be given the marginal probability of each box, since it doesn't depend on the number of fruits the box contains. I just can't wrap my mind around why the table method doesn't work here. Also, is there a way to construct such a table if I were given the marginal probabilities (of the red and blue boxes) beforehand?