A random variable whose distribution is also random


I feel like this question has an obvious answer, but I somehow missed it during my probability class. There are random variables whose distributions can be expressed in the form of known functions, like the Gaussian, uniform, binomial, etc. If I take a ruler and measure the length of my laptop again and again, the outcomes will fit a bell-shaped curve, and the more measurements I take, the better and smoother this curve will be. There is a certain (large) number of measurements I need to get a nice bell curve, and once I take that sufficient number of measurements, I always get the same (at least the-same-Gaussian-nature) distribution.

But is it possible to find a random variable whose distribution isn't going to settle down to some concrete form or function? That is, when I repeat my experiment, there is no number of measurements, however large, that gives me the same result over and over; the result always changes. When I measure my imaginary laptop today I get one curve, tomorrow another, and so on, and all the curves are completely random and don't look alike.

Just by gut feeling, there are natural processes like that; when I think about it, stock price dynamics look quite similar. If that's true, are they studied by probability theory, and is there a special name for them, maybe? Or maybe I'm wrong, and given a really huge number of outcomes from a stock market, we'd see that it has a meaningful distribution too?


There are 3 best solutions below


I think what you describe is a stochastic process $\{X_n\}_{n\in\mathbb{N}}$ in which each variable $X_n$ can have a different distribution. Unfortunately, if there is no structure in these distributions, not many interesting things can be concluded from such a process. A type of stochastic process which is studied and used intensively is the Markov process in which we assume that $X_n|X_{n-1},\ldots,X_1 \sim X_n|X_{n-1}$.
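
To make this concrete, here is a minimal sketch (my own illustration, not part of the answer) of a Markov process: a Gaussian random walk, where the distribution of each $X_n$ depends on the past only through $X_{n-1}$.

```python
import random

random.seed(0)

# A simple Markov process: each X_n is drawn from a normal distribution
# centered at the previous value X_{n-1} (a Gaussian random walk).
def markov_path(n_steps, x0=0.0, sigma=1.0):
    path = [x0]
    for _ in range(n_steps):
        # The conditional law of X_n given the whole past depends
        # only on X_{n-1}:  X_n | X_{n-1} ~ N(X_{n-1}, sigma^2)
        path.append(random.gauss(path[-1], sigma))
    return path

path = markov_path(1000)
print(len(path))  # 1001: the starting point plus 1000 steps
```

Here each $X_n$ has a different marginal distribution ($N(0, n\sigma^2)$ if $x_0 = 0$), yet the process has enough structure to be analyzed.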

In your laptop example, a stochastic process is probably unnecessary because of the central limit theorem. Reasonably safe assumptions on your measurements are that they are all independent and identically distributed with finite first and second moments. In this case the central limit theorem tells us that \begin{equation} \sqrt{n}\left(\frac{1}{n}\sum_{i=1}^n X_i - \mathbb{E}(X)\right) \stackrel{d}{\rightarrow} N(0,\operatorname{Var}(X)). \end{equation}
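
A quick simulation (my own sketch, with made-up numbers) shows this: the individual "measurements" below are uniform, not Gaussian at all, yet the standardized sample mean clusters into a bell shape with the variance the CLT predicts.

```python
import random
import statistics

random.seed(42)

# Simulate n i.i.d. "ruler measurements" (uniform noise, nothing Gaussian
# about a single measurement) and form the standardized sample mean.
def standardized_mean(n):
    xs = [random.uniform(9.0, 11.0) for _ in range(n)]  # true mean 10
    return (n ** 0.5) * (statistics.fmean(xs) - 10.0)

# Repeating the experiment many times: the standardized means cluster
# around 0 with variance close to Var(Uniform(9, 11)) = (11-9)^2/12 = 1/3.
samples = [standardized_mean(1000) for _ in range(2000)]
print(statistics.fmean(samples))     # near 0
print(statistics.pvariance(samples)) # near 1/3
```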


Suppose $X \sim N(\theta, \sigma^2)$, and suppose $\theta$ here is itself random, with $\theta \sim N(\mu_1,\sigma_1^2)$. The idea is to treat the parameter as itself random. Here $\mu_1$ and $\sigma_1$ are called hyperparameters; you can add as many levels of hyperparameters as you want, and the result may be called a hierarchical model.

Your idea is akin to "Bayesian" models. Take a look at http://en.wikipedia.org/wiki/Prior_probability
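
A small simulation of this two-level setup (my own sketch, with arbitrary hyperparameter values): each draw first picks a fresh $\theta$, then an $X$ around it, which matches the asker's picture of a curve whose center moves from day to day.

```python
import random

random.seed(1)

# Hierarchical (two-level) model: the mean theta of X is itself random.
# The hyperparameters mu1, sigma1 govern the distribution of theta.
def sample_hierarchical(mu1=0.0, sigma1=2.0, sigma=1.0):
    theta = random.gauss(mu1, sigma1)  # theta ~ N(mu1, sigma1^2)
    x = random.gauss(theta, sigma)     # X | theta ~ N(theta, sigma^2)
    return x

# Marginally X ~ N(mu1, sigma1^2 + sigma^2): even though the center
# theta is redrawn every time, the overall law of X is still fixed.
draws = [sample_hierarchical() for _ in range(50000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(mean, var)  # mean near 0, variance near 2^2 + 1^2 = 5
```

So a random parameter by itself does not escape having a distribution; it just pushes the randomness up one level.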


If that's true, are they studied by probability theory, is there a special name, maybe?

Hidden-variable models, for example hidden Markov models.
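
As a hypothetical sketch of how a hidden Markov model captures the stock-price intuition (the states, switch probability, and volatilities below are invented for illustration): a hidden "regime" flips between calm and volatile, and each observation is drawn from a distribution that depends on the current unseen regime.

```python
import random

random.seed(3)

STATES = {"calm": 0.5, "volatile": 3.0}  # hidden state -> emission std
SWITCH_PROB = 0.05                       # chance of changing regime per step

def simulate_hmm(n):
    state = "calm"
    observations = []
    for _ in range(n):
        if random.random() < SWITCH_PROB:  # hidden Markov transition
            state = "volatile" if state == "calm" else "calm"
        # Emission: the observed "return" depends on the hidden regime.
        observations.append(random.gauss(0.0, STATES[state]))
    return observations

returns = simulate_hmm(500)
print(len(returns))  # 500
```

The observed sequence looks like it has no settled distribution on short stretches, yet the joint process is perfectly well defined and is studied under this name.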