Relationship between random and non-random variables.


Let $X$ and $Y$ be random variables and $k$ be a non-random constant.

Assume that $Y = kX$. It would seem contradictory to write $k = Y / X$, since $k$ is non-random. In general, this must mean that (some) ordinary manipulations cannot be used in expressions mixing random and non-random variables.

I know that $Y$ and $X$ are measurable functions, but I am not knowledgeable enough to say whether this is why we cannot write $k = Y / X$.

For instance, consider two non-random functions $y(t)$ and $x(t)$ such that $y(t) = k\,x(t)$; then we have $k = y(t) / x(t)$...

Does someone have an explanation for this? Where can I learn more about it?


---

In this situation it is indeed risky to write $k = Y/X$, but this is because the random variable $X$ is actually a function that may take the value $0$, in which case the RHS is not well-defined. If $X$ is a random variable that never takes the value $0$, then there is no objection.

You could argue that on the LHS we have a constant and on the RHS we have a function, but that is not really a problem: we can simply identify $k$ with the constant function on the same domain as $X$ and $Y$.

After all, it does not hurt to write equalities like $$1=\cos^2t+\sin^2t$$ where the LHS is a constant and the RHS is a function of $t$.

Your objection "$k$ non-random" has the same character as the objection "$1$ is not a function".

---

Since $Y = kX,$ it follows that whenever $X \neq 0,$ $$ \frac YX = \frac{kX}{X} = k. $$ The ratio $\frac YX$ is as "random" as the answer to "pick a random number between $k$ and $k.$"
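A quick numerical sketch illustrates this (the value $k = 3$ and the sampling interval are arbitrary assumptions): however $X$ varies from outcome to outcome, the ratio $Y/X$ never does.

```python
import random

k = 3  # an arbitrary, assumed constant

# Sample X from a distribution that avoids 0, and set Y = k * X.
samples = [random.uniform(1, 10) for _ in range(1000)]
ratios = [k * x / x for x in samples]  # Y / X on each outcome

# Despite X being random, Y / X equals k on every outcome
# (up to floating-point rounding).
assert all(abs(r - k) < 1e-12 for r in ratios)
```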

Here's a simpler example: Let $Z = -X.$ Then $$X + Z = 0.$$ Nothing contradictory about that, even though everything on the left is "random" and the right-hand side is constant. It's just another example of the sort of thing that can happen when one random variable is completely dependent on another: you can set things up so that any random variation in one variable cancels out the variation in the other.
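The cancellation in this simpler example can likewise be checked numerically (a sketch; the standard normal distribution here is just an arbitrary choice for $X$):

```python
import random

# X is random; Z = -X is completely dependent on X.
xs = [random.gauss(0, 1) for _ in range(1000)]
zs = [-x for x in xs]

# On every single outcome the "randomness" cancels exactly:
# x + (-x) == 0.0 holds exactly in floating point for finite x.
sums = [x + z for x, z in zip(xs, zs)]
assert all(s == 0.0 for s in sums)
```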

---
  1. Intuitively:

You can think of the randomness as lying in *how* $\frac YX$ equals $k$. For example, it could be $\frac {2k}2, \frac{5k}5, \frac{500k}{500}$, but the fraction always simplifies to $k$.

  2. Precisely (without measure theory):

Every constant function is a random variable, by the definition of random variables. Constant random variables just have a 'randomness' of zero, where 'randomness' can be made precise as a variance of zero.

For example, consider a fair coin flip where you get a payoff of 1 unit regardless of the outcome.

Our probability space is $(\Omega, \mathbb P)$

where $\Omega = \{H,T\}$ and $\mathbb P(H) = \mathbb P(T) = 0.5$.

The payoff can be modelled by a constant random variable

$$X(\omega)=1 \ \text{if} \ \omega = H$$ $$X(\omega)=1 \ \text{if} \ \omega = T$$

Observe that $E[X] = 1$ which coincides with $E[1] = 1$.

Observe that $Var[X] = 0$ which coincides with $Var[1] = 0$.
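A minimal sketch of this coin flip, computing $E[X]$ and $Var[X]$ directly from the two outcomes:

```python
# Fair coin: P(H) = P(T) = 0.5; payoff X(omega) = 1 for every omega.
probs = {"H": 0.5, "T": 0.5}
X = {"H": 1, "T": 1}  # a constant random variable

# E[X] = sum over outcomes of P(omega) * X(omega).
EX = sum(probs[w] * X[w] for w in probs)
# Var[X] = E[(X - E[X])^2].
VarX = sum(probs[w] * (X[w] - EX) ** 2 for w in probs)

assert EX == 1.0    # coincides with E[1] = 1
assert VarX == 0.0  # coincides with Var[1] = 0
```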

  3. Precisely (with measure theory):

$X$ is a random variable on a probability space $(\Omega, \mathscr F, \mathbb P)$ if $\forall \ B \in \mathscr B$, $X^{-1}(B) \in \mathscr F$; that is, for whatever value we want from $X$, the events that give rise to that value belong to the $\mathscr F$ of our probability space. Here our choices for $\mathscr F$ are either $\mathscr F_0 := \{\emptyset, \Omega\}$ or $2^{\Omega} := \{\emptyset, \{H\}, \{T\}, \Omega\}$.

The possible $B$'s that we have are:

  1. $\{B \in \mathscr B \mid 1 \in B \}$, e.g. $\{1\},(0,\infty),\{1,2,3,\dots\},\mathbb R$

  2. $\{B \in \mathscr B \mid 1 \notin B \}$, e.g. $\{\pi\},(-\infty,0),\{-1,-2,-3,\dots\},\mathbb Q^C$

For the former kind of $B$, $X^{-1}(B) = \Omega \in \mathscr F$ for either choice of $\mathscr F$; that is, the events that give rise to the value $1$ are $\{H\}$ and $\{T\}$, whose union ('or' is $\cup$) is $\Omega$.

For the latter kind of $B$, $X^{-1}(B) = \emptyset \in \mathscr F$ for either choice of $\mathscr F$; that is, no event gives rise to a value other than $1$: collectively, these (non-)events form $\emptyset$.
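The preimage argument can be sketched on this finite space. A Borel set $B$ enters only through the membership test "is the value in $B$?", so we model $B$ as a predicate (an assumption made for this illustration):

```python
# Finite probability space: Omega = {H, T}, X(omega) = 1 for all omega.
Omega = frozenset({"H", "T"})
X = {"H": 1, "T": 1}

def preimage(X, in_B):
    """X^{-1}(B) on a finite space; the Borel set B is given as a predicate."""
    return frozenset(w for w in X if in_B(X[w]))

F0 = {frozenset(), Omega}  # the trivial sigma-algebra {emptyset, Omega}

# B's containing 1 pull back to Omega; B's missing 1 pull back to emptyset.
assert preimage(X, lambda v: v == 1) == Omega        # e.g. B = {1}
assert preimage(X, lambda v: v > 0) == Omega         # e.g. B = (0, inf)
assert preimage(X, lambda v: v < 0) == frozenset()   # e.g. B = (-inf, 0)

# Both possible preimages lie even in the smallest sigma-algebra F0,
# so the constant X is measurable with respect to every choice of F.
assert preimage(X, lambda v: v == 1) in F0
assert preimage(X, lambda v: v < 0) in F0
```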


Reference: see Introduction to Mathematical Statistics and Its Applications by Larsen & Marx, or Probability with Martingales by David Williams; if the exact topic is not covered there, a related concept should be. Also:

  1. https://en.wikipedia.org/wiki/Degenerate_distribution

  2. https://stats.stackexchange.com/questions/192179/