Calculating Probability Distributions


I've posted a few times about this specific distribution question, but I am confused about what exactly my professor is asking for here. The homework question is:

A physical measurement of $x$ gives results for $-\infty < x < \infty$ with probability distribution (or density) given approximately by \begin{equation}\label{prob2} p(x) = \frac{a}{\pi (a^2 + x^2)} \end{equation} with $a>0$.

with my question at part (d),

Calculate the probability distribution $P(X)$ of the average, $X$, of $M$ values of x, each drawn from the distribution of $p(x)$.

I do not know what this means. My guess would be to take $p(x)$ from $-M < x < M$ and get a new PDF, but this doesn't make a whole lot of sense to me. I am asked this same question across a few different problems, so I'd like some clarification before moving on to the others. Any help interpreting this would be appreciated.

Thank you.

2 Answers

Best Answer

We say that $Y$ is drawn from the distribution of $p(x)$ if it has the probability density $p$. What is meant is for you to take new (independent) random variables $X_1, \ldots, X_M$, all of which have the same density, $p(x)$. Then their average is $$X=\frac{1}{M}\sum_{i=1}^M X_i.$$ This is yet another random variable, and the problem is asking you to find its distribution.
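To make the phrase "the average of $M$ values drawn from $p(x)$" concrete, here is a small simulation sketch (my own illustration, not part of the assignment) that draws each $X_i$ by inverse-transform sampling from the given density, assuming $a = 1$, and then averages:

```python
import math
import random

def draw(a, rng):
    """One draw from p(x) = a / (pi * (a^2 + x^2)) via the inverse CDF:
    x = a * tan(pi * (u - 1/2)) for u uniform on (0, 1)."""
    return a * math.tan(math.pi * (rng.random() - 0.5))

def average_of_M(M, a, rng):
    """The random variable X = (X_1 + ... + X_M) / M from the answer."""
    return sum(draw(a, rng) for _ in range(M)) / M

rng = random.Random(0)
a = 1.0
n = 10_000

# Estimate P(|X| <= a) for two different M.  For a single draw the CDF gives
# P(|X_1| <= a) = 1/2; strikingly, for this density averaging more
# measurements does not shrink the spread.
frac_M1 = sum(abs(average_of_M(1, a, rng)) <= a for _ in range(n)) / n
frac_M50 = sum(abs(average_of_M(50, a, rng)) <= a for _ in range(n)) / n
print(frac_M1, frac_M50)  # both near 0.5
```

Each call to `average_of_M` produces one realization of $X$; histogramming many such realizations approximates the density $P(X)$ the problem asks for.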

EDIT: This is a common technique in statistics for estimating an unknown parameter. Here is an example with a different distribution than the one in your problem, but one that may be easier to understand. Suppose we have $0\leqslant q\leqslant 1$ and a coin which comes up heads with probability $q$ and tails with probability $1-q$. If the coin comes up heads, we win one dollar, and if it comes up tails, we win nothing. Let $X$ be our winnings after one flip; then $X$ has some distribution. Now suppose that instead of flipping the coin once, we flip it $M$ times. Let $X_i$ be the winnings from the $i^{th}$ flip (so $\mathbb{P}(X_i=1)=q$ and $\mathbb{P}(X_i=0)=1-q$). Then we may define the average winnings, $$S=\frac{1}{M}\sum_{i=1}^M X_i.$$ This is an average of Bernoulli random variables, and $$\mathbb{P}\left(S=\frac{j}{M}\right)=\binom{M}{j}q^j (1-q)^{M-j}$$ for $j=0, 1, \ldots, M$.
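The binomial formula above is easy to verify numerically; this sketch (with illustrative values $M = 10$, $q = 0.3$) tabulates the distribution of the average winnings and checks that the probabilities sum to one and that $\mathbb{E}[S] = q$:

```python
import math

def avg_winnings_pmf(M, q):
    """P(S = j/M) = C(M, j) * q^j * (1-q)^(M-j) for j = 0..M (binomial)."""
    return [math.comb(M, j) * q**j * (1 - q)**(M - j) for j in range(M + 1)]

M, q = 10, 0.3
pmf = avg_winnings_pmf(M, q)
total = sum(pmf)                                     # probabilities sum to 1
mean = sum((j / M) * p for j, p in enumerate(pmf))   # E[S] equals q
print(total, mean)
```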

Thus we had some initial distribution (the distribution of a single flip, which is the observation of a single measurement/flip). Then we take $M$ independent measurements/flips and average the results.

Now let us use your specific distribution, and let us consider the case $M=2$. For concreteness, let's say we are measuring the position of a particle along the real number line, and for $c<d$, the probability that we observe the particle between positions $c$ and $d$ is given by $$\mathbb{P}(c\leqslant x\leqslant d)=\int_c^d \frac{a}{\pi(a^2+t^2)} dt.$$

Now suppose that rather than measuring the particle's position once, we measure it twice, obtaining observations $x$ and $y$ from two independent tests. We can then consider the average of the two measurements, $\frac{x+y}{2}$, and we want to know $$\mathbb{P}\left(c\leqslant \frac{x+y}{2}\leqslant d\right).$$ An expression for this in terms of $c$ and $d$ would tell us the distribution of $\frac{x+y}{2}$. This is the $M=2$ case of your question, and it can be solved by taking a convolution (since this edit is meant to explain the concepts, I will assume you know how to find the density of a sum of random variables with known individual densities by convolving them). We are measuring the position of the particle twice, but the probability of finding the particle in a given position does not change, so the two measurements $x$ and $y$ both have density $p$.
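The $M=2$ convolution can be sanity-checked numerically. In this sketch (assuming $a = 1$), the density of the sum is approximated by a Riemann sum for $f_{x+y}(s) = \int p(t)\,p(s-t)\,dt$, and the change of variables $z = s/2$ gives the density of the average as $f_{\text{avg}}(z) = 2 f_{x+y}(2z)$:

```python
import math

a = 1.0
p = lambda x: a / (math.pi * (a * a + x * x))  # the density from the problem

# Riemann-sum approximation of the convolution f_sum(s) = ∫ p(t) p(s - t) dt,
# truncated to [-L, L] with step h (the heavy tails beyond L contribute little).
h, L = 0.05, 200.0
grid = [i * h for i in range(-int(L / h), int(L / h) + 1)]

def f_sum(s):
    return sum(p(t) * p(s - t) for t in grid) * h

# Density of the average (x + y)/2 at z is 2 * f_sum(2z); compare with p at 0.
f_avg_at_0 = 2 * f_sum(0.0)
print(f_avg_at_0, p(0.0))  # the two values agree closely
```

That the two printed values match hints at the answer for general $M$: averaging these measurements reproduces the original density.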

To answer the question in full, we measure the particle's position $M$ times to obtain measurements $x_1, \ldots, x_M$ and form the average $\hat{x}:=\frac{1}{M}\sum_{i=1}^M x_i$. We would like to know the density of $\hat{x}$, which means finding an expression for $$\mathbb{P}\left(c\leqslant \frac{1}{M}\sum_{i=1}^M x_i\leqslant d\right)$$ in terms of $c$ and $d$.


The distribution given in the question is a Cauchy distribution (in fact, a scaled form of the standard one). The question asks for the distribution of $\frac{1}{M} \sum_{i=1}^M X_i$, where the $X_i$ are independent and identically distributed Cauchy random variables. Note that the sum of $M$ independent Cauchy random variables is itself Cauchy, with the parameters added (see here). Since all the random variables here are identically distributed with scale $a$, the sum has scale $Ma$, and dividing by $M$ brings the scale back to $a$: the empirical mean is distributed exactly as the original Cauchy distribution in the question.
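This stability property can be checked with characteristic functions, a sketch assuming the standard fact that a centered Cauchy with scale $a$ has characteristic function $\varphi(t) = e^{-a|t|}$; for the mean of $M$ iid copies, $\varphi_{\text{mean}}(t) = \varphi(t/M)^M = e^{-a|t|}$, identical to a single measurement:

```python
import math

a, M = 2.0, 7  # illustrative scale and sample size

def cf_cauchy(t, scale):
    """Characteristic function of a centered Cauchy with the given scale."""
    return math.exp(-scale * abs(t))

# CF of the mean of M iid copies: phi(t/M)^M = exp(-a|t|/M)^M = exp(-a|t|),
# i.e. exactly the CF of one measurement, so the mean has the same law.
for t in [0.1, 0.5, 1.0, 3.0]:
    lhs = cf_cauchy(t / M, a) ** M
    rhs = cf_cauchy(t, a)
    print(t, lhs, rhs)
```

Since characteristic functions determine distributions uniquely, the agreement at every $t$ establishes that averaging $M$ Cauchy measurements gives back the same Cauchy law.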

A similar question can be found here.