Ratio of expected values


Let's say that I have random variables $A$ and $B$, and, over $N$ events, $B$ counts things that are a subset of what $A$ counts. Can I use $\frac{\mathbb{E}[B]}{\mathbb{E}[A]}$ as the probability of $B$ given $A$? Note that I am not asking about $\mathbb{E}\left[\frac{B}{A}\right]$, which I do not believe can be computed easily here. However, it seems to me that since $\mathbb{E}[B]$ and $\mathbb{E}[A]$ are both deterministic values, $\frac{\mathbb{E}[B]}{\mathbb{E}[A]}$ should be deterministic as well, and should in fact equal the probability of $B$ given $A$.

If you need a concrete example: in each time period I can either throw a die or not, and $A$ is the number of times I throw the die over $N$ time periods. On each throw I can either get a 6 or not, and $B$ is the number of times I get a 6. Therefore, if I know the expected totals of each, I can calculate the probability of getting a 6 when the die is rolled as the ratio of the expected values.
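For what it's worth, a quick simulation of this setup suggests the ratio does come out to $1/6$ here (the 0.7 chance of throwing in a given period is an arbitrary choice for illustration):

```python
import random

random.seed(0)

N = 100_000      # number of time periods
p_throw = 0.7    # assumed probability of throwing the die in a period
p_six = 1 / 6    # probability a throw shows a 6

A = 0  # number of throws
B = 0  # number of 6s
for _ in range(N):
    if random.random() < p_throw:
        A += 1
        if random.random() < p_six:
            B += 1

# Over many periods B/A approximates E[B]/E[A], which here equals p_six
# because E[B] = E[A] * p_six in this setup.
print(B / A)  # close to 1/6 ≈ 0.1667
```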

Is this correct or incorrect?



$\mathsf E(B)/\mathsf E(A)$ is a constant. $\mathsf P(B=b\mid A=a)$ is a function whose value depends on the parameters $a,b$ and on the joint distribution of the random variables. In general, the two are not equal.
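A minimal concrete case, enumerated exactly (the two-coin setup is my own choice for illustration):

```python
from itertools import product
from fractions import Fraction

# Toy space: two fair coin flips, each outcome has probability 1/4.
# A = total number of heads, B = heads on the first flip, so B <= A.
omega = list(product([0, 1], repeat=2))
prob = Fraction(1, 4)

E_A = sum(prob * (w[0] + w[1]) for w in omega)   # = 1
E_B = sum(prob * w[0] for w in omega)            # = 1/2
ratio = E_B / E_A                                # = 1/2

# P(B = 0 | A = 0): the only outcome with A = 0 is (0, 0),
# and there B = 0, so the conditional probability is 1.
p_cond = (sum(prob for w in omega if w[0] == 0 and w[0] + w[1] == 0)
          / sum(prob for w in omega if w[0] + w[1] == 0))

print(ratio, p_cond)  # 1/2 1, so the two quantities differ
```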


As @Math1000 says, the terminology used in your question makes it ill-posed: "probabilities of random variables" are not a thing.

To write $P(A)$ and $P(B)$, we need $A$ and $B$ to be events. That is, they are not measurements like "he is 1.56 meters tall" or "I rolled a 3 on this die", but boolean ("yes/no", "true/false") outcomes of some real-world process. For example, an event could be "I went to work today" or "I rolled a 5 on this 6-sided die", or even "3 of the 10 dice rolls were even numbers".

By contrast, random variables are functions which describe a certain measurement associated with each possible outcome. For example, the following can all be described by random variables:

  • How tall is this person?
  • How many times did you go to work this week?
  • What value did you roll on the die?
  • How many of your 10 dice rolls were even?
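In code, such measurements are nothing more than ordinary functions from an outcome to a number. A small sketch, with die-roll outcomes represented as tuples (the function names are mine, purely for illustration):

```python
# Random variables are just functions on outcomes. With outcomes that are
# pairs (first die, second die), each of these maps an outcome to a number:
def first_roll(outcome):
    return outcome[0]

def total(outcome):
    return outcome[0] + outcome[1]

def count_even(outcome):
    return sum(1 for v in outcome if v % 2 == 0)

print(total((1, 3)))       # 4
print(count_even((2, 2)))  # 2
```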

Formally, the probability function $P$ is a measure on the set of events, and so expressions like $P(A)$ and $P(B)$ are well-defined, where $A$ and $B$ are events, not random variables. In order to get the probability of making a certain measurement, we need to construct an event from a random variable, and then take the probability of that event.

For example, consider the probability space of rolling two fair 3-sided dice together. Then the sample space $\Omega$ is the set

$$ \begin{align} \big\{ & (1,1), (1,2), (1,3), \\ & (2,1), (2,2), (2,3), \\ & (3,1), (3,2), (3,3) \big\}, \end{align} $$

where each element of the sample space is a pair whose elements describe the roll of the first die and the second die, in that order. Each element of this set is called an outcome, and an event is a set of outcomes. An event is said to occur if one of the outcomes it consists of occurs. Here, we see that, for any $\omega \in \Omega$, if we let $E = \{\omega\}$ be an event, then $P(E) = \frac{1}{9}$ (by definition, since there are 9 possible outcomes and the dice are fair).
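This tiny probability space can be written out explicitly; a sketch, treating events as Python sets of outcomes:

```python
from itertools import product
from fractions import Fraction

# Sample space for two fair 3-sided dice: 9 equally likely outcomes.
omega = list(product([1, 2, 3], repeat=2))
assert len(omega) == 9

def P(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(len(event), len(omega))

# Each singleton event {w} has probability 1/9.
print(P({(2, 3)}))  # 1/9
```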

A random variable is a function which takes an outcome as input and outputs the associated measurement. Consider the random variable $X$ which describes the total (sum) of a combined roll of these two dice. Then, for example, $X(1,3) = 4$, $X(2,1) = 3$, etc. In order to construct an event from $X$, we need to consider a particular value that the roll total might take. Then an event $E$ like "The total of the roll is 4" can be constructed as

$$E := \big\{ \omega \in \Omega \:\big|\: X(\omega) = 4 \big\}.$$

The probability of this event is then $P(E)$; or, written in full,

$$P \bigg( \big\{ \omega \in \Omega \:\big|\: X(\omega) = 4 \big\} \bigg).$$

However, such expressions quickly become horribly tedious to write and interpret, so we just use expressions like $X=4$ and $P(X=4)$ as shorthand for these, respectively. Expressions like $P(X)$ don't mean anything.
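The shorthand $P(X=4)$ is exactly this set construction; a sketch by direct enumeration:

```python
from itertools import product
from fractions import Fraction

omega = list(product([1, 2, 3], repeat=2))

def X(w):
    """Random variable: total of the two dice."""
    return w[0] + w[1]

# The event "X = 4" is the set of outcomes that X maps to 4.
event = {w for w in omega if X(w) == 4}
print(sorted(event))  # [(1, 3), (2, 2), (3, 1)]

p = Fraction(len(event), len(omega))
print(p)              # 1/3
```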


Let's say that I have random variables, $A$ and $B$, and, given $N$ events, $B$ is a count of things that are a subset of what $A$ is a count of.

It now rests upon you to clarify what you mean by this, if you think that the question you have in mind still makes sense. For now, however, I shall interpret it as best I can.

Let $M$ be the set of all potential measurements of some quantity. Let $X: \Omega \to M$ and $Y: \Omega \to M$ be random variables such that, for all $\omega \in \Omega$, we have $X(\omega) \leq Y(\omega)$. Thus, in some sense, $X$ counts a subset of what $Y$ counts. Are you then asking the following question?

$$\text{For all }m \in M, \text{ is it true that }\frac{E(X)}{E(Y)} = P\big(X=m \:\big|\: Y=m\big)?$$

If not, then I reckon that clarifying what you mean by "$X$ is a count of things that are a subset of what $Y$ is a count of" in more formal, well-defined terms would likely make your question easily answerable.
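On a small space, the proposed identity can be checked by brute-force enumeration. A sketch, where $X$ and $Y$ are my own illustrative choices satisfying $X(\omega) \leq Y(\omega)$:

```python
from itertools import product
from fractions import Fraction

# Two fair 3-sided dice; X = first die, Y = max of the two dice,
# so X(w) <= Y(w) for every outcome w.
omega = list(product([1, 2, 3], repeat=2))
prob = Fraction(1, len(omega))

def X(w):
    return w[0]

def Y(w):
    return max(w)

E_X = sum(prob * X(w) for w in omega)  # 2
E_Y = sum(prob * Y(w) for w in omega)  # 22/9
ratio = E_X / E_Y                      # 9/11

for m in [1, 2, 3]:
    joint = sum(prob for w in omega if X(w) == m and Y(w) == m)
    marginal = sum(prob for w in omega if Y(w) == m)
    cond = joint / marginal            # P(X = m | Y = m)
    print(m, cond, ratio, cond == ratio)
```

In this example the conditional probabilities come out to $1$, $2/3$, and $3/5$ for $m = 1, 2, 3$, while the ratio of expectations is $9/11$, so at least for this choice of $X$ and $Y$ the identity fails.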