I am not given any figures to answer this question. What's the right approach?


Z is a random variable defined as the sum of N independent Bernoulli trials, where each trial takes the value 1 with probability p. The number of trials N is itself a random variable following a Poisson distribution with parameter lambda.

  1. What would you expect the correlation coefficient between Z and N to be?
  2. Describe the effect of lambda on the mean of Z.
  3. Explicitly derive the marginal distribution function of Z.

My attempt (please fix this):

  1. The correlation coefficient between Z and N should be positive, because E[Z | N = n] = n * p: as n gets larger with p held constant, the conditional expectation of Z also increases. This shows that the two variables move in the same direction.

  2. lambda = E(N), so as lambda increases, the expected number of trials increases. Conditional on N = n, Z is binomial with parameters n and p, hence has mean n * p, so as the expected number of trials increases, E(Z) increases as well. There is therefore a positive relationship between lambda and E(Z).

  3. Help please. I'm a lost soul on this one.

Best answer:

Observe that if $X_1,\dots,X_n$ are independent, identically distributed (i.i.d.) random variables, all Bernoulli distributed with success probability $p$, then $$X = \sum_{k=1}^n X_k \sim \mathcal{B}(n,p) $$ that is $X$ has a binomial distribution.
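As a quick sanity check of this fact, the sketch below (with illustrative values n = 10, p = 0.4, k = 4, not taken from the problem) compares the empirical frequency of the sum hitting k against the binomial pmf:

```python
import math
import random

random.seed(1)

# Sanity check (illustrative values): a sum of n i.i.d. Bernoulli(p)
# draws should follow Binomial(n, p), so the observed frequency of
# X = k should be close to C(n, k) p^k (1-p)^(n-k).
n, p, k, trials = 10, 0.4, 4, 100_000

hits = sum(
    sum(random.random() < p for _ in range(n)) == k
    for _ in range(trials)
)
freq = hits / trials
exact = math.comb(n, k) * p**k * (1 - p) ** (n - k)
print(freq, exact)  # the two should agree to about two decimal places
```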

Your model is $Z|N\sim \mathcal{B}(N,p)$, a binomial variable whose number of trials $N$ is a Poisson variable, $N\sim \mathcal P(\lambda)$.

So you have $\Bbb E(Z|N)=Np$, and by the tower property, $\Bbb E(Z)=\Bbb E\big(\Bbb E(Z|N)\big)=\Bbb E(Np)=\Bbb E(N)\,p=\lambda p$.
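This also settles question 1 in closed form: $\operatorname{Cov}(Z,N)=\Bbb E(pN^2)-\lambda p\cdot\lambda=p\lambda$, while $\operatorname{Var}(N)=\lambda$ and, by the law of total variance, $\operatorname{Var}(Z)=\lambda p(1-p)+p^2\lambda=p\lambda$, so the correlation coefficient is $\rho=p\lambda/\sqrt{\lambda\cdot p\lambda}=\sqrt{p}$. A Monte-Carlo sketch of the whole model (with illustrative values $\lambda=4$, $p=0.3$, chosen here only for the check) confirms both $\Bbb E(Z)=\lambda p$ and $\rho=\sqrt p$:

```python
import math
import random

random.seed(0)

# Monte-Carlo sketch of the model (illustrative values lam = 4, p = 0.3):
# draw N ~ Poisson(lam), then Z | N ~ Binomial(N, p), and check that
# E[Z] is close to lam * p and corr(Z, N) is close to sqrt(p).
lam, p, trials = 4.0, 0.3, 200_000

Ns, Zs = [], []
for _ in range(trials):
    # Poisson sampler by inversion of the cdf (the stdlib has none built in).
    u = random.random()
    n = 0
    pmf = cdf = math.exp(-lam)
    while u > cdf:
        n += 1
        pmf *= lam / n
        cdf += pmf
    # Z | N = n is a sum of n Bernoulli(p) trials.
    z = sum(random.random() < p for _ in range(n))
    Ns.append(n)
    Zs.append(z)

mN = sum(Ns) / trials
mZ = sum(Zs) / trials                 # should be close to lam * p = 1.2
cov = sum((n - mN) * (z - mZ) for n, z in zip(Ns, Zs)) / trials
sdN = math.sqrt(sum((n - mN) ** 2 for n in Ns) / trials)
sdZ = math.sqrt(sum((z - mZ) ** 2 for z in Zs) / trials)
rho = cov / (sdN * sdZ)               # should be close to sqrt(0.3) ~ 0.548
```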

The model for a typical observation $X$ conditional on an unknown parameter $\theta$ is $f(x|\theta)$. As a function of $\theta$, $f(x|\theta)$ is called the likelihood; the functional form of $f$ is fully specified up to the parameter $\theta$. The parameter $\theta$ is supported on the parameter space $\Theta$ and treated as a random variable with distribution $\pi(\theta)$, called the prior. The distribution $h(x,\theta) =f(x|\theta)\pi(\theta)$ is called the joint distribution of $X$ and $\theta$. The joint distribution can also be factorized as $$ h(x,\theta) =\pi(\theta|x)m(x) $$ where $\pi(\theta|x)$ is called the posterior distribution of $\theta$ given $X=x$. The marginal distribution $m(x)$ is obtained by summing over $\theta$ (or integrating $\theta$ out, for continuous distributions) in the joint distribution $h(x,\theta)$: $$ m(x)=\sum_{\theta\in\Theta}h(x,\theta)=\sum_{\theta\in\Theta}f(x|\theta)\pi(\theta) $$ Therefore the posterior $\pi(\theta|x)=\frac{h(x,\theta)}{m(x)}$ can be expressed as $$ \pi(\theta|x)=\frac{f(x|\theta)\pi(\theta)}{\sum_{\theta'\in\Theta}f(x|\theta')\pi(\theta')} $$

Thus for the model $Z|N\sim \mathcal{B}(N,p)$ and $N\sim \mathcal P(\lambda)$, the joint distribution has density $$ h(z,N)=f(z|N)\pi(N)=\binom{N}{z} p^z(1-p)^{N-z}\times \frac{\lambda^N e^{-\lambda}}{N!}=\pi(N|z)m(z) $$ for $N\ge z$ (and $h(z,N)=0$ otherwise). Rewriting the joint distribution as $h(z,N)=\frac{p^z\lambda^z}{z!}\mathrm e^{-\lambda}\,\frac{[(1-p)\lambda]^{N-z}}{(N-z)!}$ and summing over $N$, we find the distribution of $Z$: $$ m(z)=\sum_{N=z}^{\infty}h(z,N)=\frac{p^z\lambda^z}{z!}\mathrm e^{-\lambda}\sum_{k=0}^{\infty}\frac{[(1-p)\lambda]^{k}}{k!}=\frac{p^z\lambda^z}{z!}\mathrm e^{-\lambda}\times \mathrm e^{\lambda(1-p)}=\frac{(p\lambda)^z}{z!}\mathrm e^{-p\lambda} $$ that is, $Z\sim \mathcal P(p\lambda)$.
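The closed form can be verified numerically by truncating the sum over $N$ (with illustrative values $\lambda=4$, $p=0.3$; the tail terms decay factorially, so the truncation error is negligible):

```python
import math

# Numeric check of the derivation (illustrative values lam = 4, p = 0.3):
# the sum over N of f(z|N) * pi(N), truncated once the terms are
# negligible, should match the closed-form Poisson(p * lam) pmf at z.
lam, p = 4.0, 0.3

def h(z, N):
    # Joint density: Binomial(N, p) likelihood times Poisson(lam) prior.
    return (math.comb(N, z) * p**z * (1 - p) ** (N - z)
            * lam**N * math.exp(-lam) / math.factorial(N))

for z in range(6):
    m_z = sum(h(z, N) for N in range(z, 80))   # tail decays factorially
    closed = (p * lam) ** z * math.exp(-p * lam) / math.factorial(z)
    assert abs(m_z - closed) < 1e-12
```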

The posterior $\pi(N|z)=\frac{h(z,N)}{m(z)}$ is $$ \pi(N|z)=\frac{[(1-p)\lambda]^{N-z}}{(N-z)!}\mathrm e^{-((1-p)\lambda)}\qquad\text{for}\; N=z,\,z+1,\,\ldots $$
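In other words, conditional on $Z=z$, the variable $N-z$ is Poisson with mean $(1-p)\lambda$. This can be checked numerically against the joint density above (again with illustrative values $\lambda=4$, $p=0.3$, $z=2$):

```python
import math

# Check of the posterior formula (illustrative values lam = 4, p = 0.3,
# z = 2): pi(N | z) = h(z, N) / m(z) should match the pmf of a Poisson
# with mean (1 - p) * lam evaluated at N - z.
lam, p, z = 4.0, 0.3, 2
mu = (1 - p) * lam

def h(z_, N):
    # Joint density: Binomial(N, p) likelihood times Poisson(lam) prior.
    return (math.comb(N, z_) * p**z_ * (1 - p) ** (N - z_)
            * lam**N * math.exp(-lam) / math.factorial(N))

m_z = sum(h(z, N) for N in range(z, 80))       # marginal, truncated sum
for N in range(z, 15):
    post = h(z, N) / m_z
    shifted = mu ** (N - z) * math.exp(-mu) / math.factorial(N - z)
    assert abs(post - shifted) < 1e-12
```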