I come across the beta distribution quite frequently when solving exercises for my statistics class. However, I have not been able to fully grasp how to work with it.
The exponential family form is:
$$ f(x)=h(x)\,c(\theta)\exp\Big(\sum_i W_i(\theta)t_i(x)\Big) $$
The Beta distribution is defined as:
$$ f(x) = \frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}$$
where $\alpha$, $\beta$, or both may be unknown.
When the Beta distribution is rearranged into exponential-family form:
$$ f(x)= \frac{1}{B(\alpha,\beta)}\, e^{(\alpha-1)\ln(x)+(\beta-1)\ln(1-x)}$$
Hence, $h(x)=I_{(0,1)}(x)$. (If you see that $h(x) = 1$, that is a cue to use an indicator function over the support of $x$.)
What does this mean?
$\omega_1(\alpha,\beta)=\alpha-1$
$\omega_2(\alpha,\beta)=\beta-1$
$\tau_1(x)=\ln(x)$
$\tau_2(x)=\ln(1-x)$
How are these determined?
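One way to sanity-check that these pieces really do reassemble into the beta density is a short numerical comparison. The sketch below (Python, standard library only; the function names are my own) evaluates the direct beta formula and the exponential-family factorization side by side:

```python
import math

def beta_pdf_direct(x, a, b):
    """Beta density written directly: x^(a-1) (1-x)^(b-1) / B(a, b)."""
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x**(a - 1) * (1 - x)**(b - 1) / B

def beta_pdf_expfam(x, a, b):
    """Same density assembled from the exponential-family pieces:
    h(x) = indicator of (0,1), c(a,b) = 1/B(a,b),
    w1 = a-1 with t1 = ln x, and w2 = b-1 with t2 = ln(1-x)."""
    if not (0 < x < 1):
        return 0.0  # h(x): the indicator of the support
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * math.exp((a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

# the two forms agree at every point of the support
for x in (0.1, 0.5, 0.9):
    assert abs(beta_pdf_direct(x, 2.0, 3.0) - beta_pdf_expfam(x, 2.0, 3.0)) < 1e-12
```

The factorization is read off by writing $x^{\alpha-1}(1-x)^{\beta-1}$ as $e^{(\alpha-1)\ln x + (\beta-1)\ln(1-x)}$; whatever multiplies a statistic of $x$ inside the exponent becomes a $\omega_i$, and the statistic itself becomes the matching $\tau_i$.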
One of the exercises specified that the distribution is $B(\alpha,1)$, where $\alpha$ is unknown and $\alpha>0$. Then the distribution becomes:
$$ f(x)= \alpha x^{\alpha -1}, \quad 0<x<1$$
How can this be derived?
I have been reading about the Beta distribution, but I cannot find anything that is easy to understand for someone with minimal mathematical background.
The Beta distribution is useful for modeling random variables between $0$ and $1$.
If $X \sim Exp(\lambda)$ is an exponentially distributed random variable, then its density function is
$$f(x;\lambda) = \begin{cases} \lambda e^{-\lambda x} & \textrm{for } x \geq 0 \\ 0 & \textrm{for } x < 0 \end{cases}$$
Now if we add two independent exponentially distributed random variables, $X=X_{1}+X_{2}$, the convolution formula gives
$$f_{X}(x) = \int_{-\infty}^{\infty} f_{X_{1}}(x-x_{2})\,f_{X_{2}}(x_{2})\, dx_{2}$$
$$f_{X}(x) = \int_{0}^{x} \lambda e^{-\lambda (x-x_{2})}\,\lambda e^{-\lambda x_{2}}\, dx_{2}$$
$$f_{X}(x) = \int_{0}^{x} \lambda^{2} e^{-\lambda x}\, dx_{2} = \lambda^{2} x e^{-\lambda x}$$
so that
$$f_{X}(x) = \begin{cases} \lambda^{2} x e^{-\lambda x} & \textrm{for } x \geq 0 \\ 0 & \textrm{for } x < 0 \end{cases}$$
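The convolution integral above can be checked numerically. This sketch (Python, standard library only; names and the midpoint-rule choice are mine) integrates $f_{X_1}(x-x_2)f_{X_2}(x_2)$ directly and compares it with the closed form $\lambda^2 x e^{-\lambda x}$:

```python
import math

lam = 2.0  # an arbitrary rate for the check

def exp_pdf(x):
    """Density of Exp(lam)."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def conv_pdf(x, n=10_000):
    """Numerical convolution: integrate exp_pdf(x - x2) * exp_pdf(x2)
    over x2 in [0, x] with a simple midpoint rule."""
    h = x / n
    return sum(exp_pdf(x - (i + 0.5) * h) * exp_pdf((i + 0.5) * h)
               for i in range(n)) * h

def closed_form(x):
    """The closed form derived above: lam^2 * x * exp(-lam * x)."""
    return lam**2 * x * math.exp(-lam * x)

for x in (0.5, 1.0, 2.0):
    assert abs(conv_pdf(x) - closed_form(x)) < 1e-6
```

The integrand is constant in $x_2$ (the exponentials combine into $\lambda^2 e^{-\lambda x}$), which is exactly why the integral collapses to $\lambda^2 x e^{-\lambda x}$.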
Now the gamma density, as you said, is
$$ f(x;\alpha, \beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}x^{\alpha-1}e^{-\beta x}$$
So if we have $X \sim Gamma(1,\lambda)$, we clearly recover an exponential distribution:
$$ f(x;1,\lambda) = \frac{\lambda}{\Gamma(1)}x^{0}e^{-\lambda x} = \lambda e^{-\lambda x}$$
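A quick numerical confirmation of this special case, using only Python's standard-library `math.gamma` (the function names are my own):

```python
import math

lam = 1.7  # an arbitrary rate for the check

def gamma_pdf(x, alpha, beta):
    """Gamma density in the rate parameterization used above:
    beta^alpha / Gamma(alpha) * x^(alpha-1) * exp(-beta x)."""
    return beta**alpha / math.gamma(alpha) * x**(alpha - 1) * math.exp(-beta * x)

def exp_pdf(x):
    """Density of Exp(lam)."""
    return lam * math.exp(-lam * x)

# Gamma(1, lam) and Exp(lam) have identical densities
for x in (0.1, 1.0, 3.0):
    assert abs(gamma_pdf(x, 1.0, lam) - exp_pdf(x)) < 1e-12
```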
Now, the gamma function is simply
$$\Gamma(x) = \int_{0}^{\infty} u^{x-1} e^{-u} du$$
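The defining integral can be evaluated numerically and compared against Python's built-in `math.gamma`. This is a rough sketch (midpoint rule on a truncated interval, both choices mine), not a serious quadrature scheme:

```python
import math

def gamma_numeric(x, upper=60.0, n=100_000):
    """Approximate Gamma(x) by midpoint integration of u^(x-1) e^(-u)
    on (0, upper); the tail beyond `upper` is negligible for moderate x."""
    h = upper / n
    return sum(((i + 0.5) * h)**(x - 1) * math.exp(-(i + 0.5) * h)
               for i in range(n)) * h

# matches the built-in gamma function to within the quadrature error
for x in (1.0, 2.5, 5.0):
    assert abs(gamma_numeric(x) - math.gamma(x)) < 1e-3
```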
The connection with the beta distribution is that the beta density is given by
$$f(x;\alpha, \beta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1} (1-x)^{\beta-1}$$
$$=\frac{1}{B(\alpha,\beta)}x^{\alpha-1} (1-x)^{\beta-1}$$
Note here that
$$= \frac{x^{\alpha-1}(1-x)^{\beta-1}}{\int_{0}^{1}u^{\alpha-1}(1-u)^{\beta-1}\, du}$$
where $B(\alpha, \beta)$ is the integral in the denominator. Essentially, the gamma and beta families are ways of describing continuous probability distributions whose shape is controlled by their parameters.
Fairly clearly, when we insert $\beta = 1$ we recover exactly the density from the exercise:
$$=\frac{1}{B(\alpha,1)}x^{\alpha-1} (1-x)^{0} = \frac{1}{B(\alpha,1)}x^{\alpha-1} = \frac{\Gamma(1+\alpha)}{\Gamma(1)\Gamma(\alpha)}x^{\alpha-1}$$ Note that $\Gamma(1) = 1$ and, by the recurrence of the gamma function, $\Gamma(1+\alpha) = \alpha\,\Gamma(\alpha)$ for all $\alpha > 0$ (for integer $\alpha$ this is the familiar $\alpha! = \alpha\,(\alpha-1)!$). Then we have
$$\frac{\Gamma(1+\alpha)}{\Gamma(1)\Gamma(\alpha)}x^{\alpha-1}= \frac{\alpha\,\Gamma(\alpha)}{\Gamma(\alpha)}x^{\alpha-1} = \alpha x^{\alpha-1} $$
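As a sanity check, a short snippet (Python standard library; `beta_pdf` is my own helper) confirms that the $B(\alpha,1)$ density collapses to $\alpha x^{\alpha-1}$, including for non-integer $\alpha$, since $\Gamma(1+\alpha)=\alpha\,\Gamma(\alpha)$ holds for all $\alpha>0$:

```python
import math

def beta_pdf(x, a, b):
    """Beta density via the gamma-function normalizing constant."""
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * x**(a - 1) * (1 - x)**(b - 1)

# Beta(alpha, 1) equals alpha * x^(alpha-1), even for non-integer alpha
for a in (0.5, 2.0, 3.7):
    for x in (0.2, 0.8):
        assert abs(beta_pdf(x, a, 1.0) - a * x**(a - 1)) < 1e-12
```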
Also note now that if $\alpha=\beta=1$, then our density is
$$f(x;1, 1) = \frac{\Gamma(1+1)}{\Gamma(1)\Gamma(1)}x^{1-1} (1-x)^{1-1} =1$$
which is the uniform density on $(0,1)$.
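And the uniform special case checks out numerically as well (same hedges as before: standard-library Python, my own helper name):

```python
import math

def beta_pdf(x, a, b):
    """Beta density via the gamma-function normalizing constant."""
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * x**(a - 1) * (1 - x)**(b - 1)

# With alpha = beta = 1 the constant Gamma(2)/(Gamma(1)Gamma(1)) = 1 and both
# powers vanish, so the density is identically 1 on (0, 1): the Uniform(0,1) pdf.
assert all(abs(beta_pdf(x, 1.0, 1.0) - 1.0) < 1e-12 for x in (0.01, 0.5, 0.99))
```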