Independence of $X_1+X_2$ and $\frac{X_1}{X_1+X_2}$ where $X_1,X_2$ are gamma distributed

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail), 2026-03-25

So in my textbook it states $$\text{Let $X_1$ and $X_2$ be independent and }X_i\sim \Gamma(\alpha_i,\beta),\ i=1,2.\text{ Then }\\ (i)\ \frac{X_1}{X_1+X_2}\sim \operatorname{Beta}(\alpha_1,\alpha_2)\\(ii)\ X_1+X_2\text{ and }\frac{X_1}{X_1+X_2}\text{ are independent.}$$ There is a proof of $(i)$ but not of $(ii)$, and I'm having trouble proving it. A link, an answer, or maybe just a hint would be appreciated!
Standard proof:
Let $Y_1 = X_1 + X_2 \sim \Gamma(\alpha_1+\alpha_2,\beta)$ and $Y_2=\frac{X_1}{X_1+X_2} \sim \operatorname{Beta}(\alpha_1, \alpha_2)$.
Then
$X_1 = Y_1\cdot Y_2 = v_1(Y_1,Y_2)$,
$X_2 = Y_1 - X_1 = Y_1 - Y_1\cdot Y_2 = Y_1(1-Y_2) = v_2(Y_1,Y_2)$.
By the change-of-variables theorem, we have that $$ f_Y(y_1,y_2) = |J_v| f_X\circ v(y_1,y_2) = |J_v| f_{X_1}(y_1\cdot y_2) f_{X_2}(y_1\cdot(1-y_2)) $$
Computing the Jacobian is straightforward:
$$ |J_v| = \begin{vmatrix} \frac{\partial v_1}{\partial y_1} & \frac{\partial v_1}{\partial y_2} \\ \frac{\partial v_2}{\partial y_1} & \frac{\partial v_2}{\partial y_2} \end{vmatrix} = \begin{vmatrix} y_2 & y_1 \\ 1-y_2 & -y_1 \end{vmatrix} = |-y_1 y_2 - y_1(1-y_2)| = |-y_1| \stackrel{(1)}{=} y_1 $$
(1): since $y_1 > 0$ on the support of the Gamma distribution.
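The Jacobian computation above can be double-checked symbolically; a minimal sketch using `sympy` (my choice of CAS, not something the answer uses):

```python
# Symbolic check that det(J_v) = -y1 for the transformation
# v1(y1, y2) = y1*y2, v2(y1, y2) = y1*(1 - y2).
import sympy as sp

y1, y2 = sp.symbols('y1 y2', positive=True)
v1 = y1 * y2          # X1 = Y1 * Y2
v2 = y1 * (1 - y2)    # X2 = Y1 * (1 - Y2)

# Jacobian matrix of (v1, v2) with respect to (y1, y2)
J = sp.Matrix([[sp.diff(v1, y1), sp.diff(v1, y2)],
               [sp.diff(v2, y1), sp.diff(v2, y2)]])
det = sp.simplify(J.det())
print(det)  # simplifies to -y1, so |J_v| = y1 since y1 > 0
```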
so we have that
$$ f_Y(y_1,y_2) = y_1 f_{X_1}(y_1\cdot y_2)\, f_{X_2}(y_1\cdot(1-y_2)) = y_1 \frac{1}{\Gamma(\alpha_1)} \frac{1}{\beta^{\alpha_1}} (y_1\cdot y_2)^{\alpha_1-1} e^{-y_1 y_2 / \beta} \frac{1}{\Gamma(\alpha_2)} \frac{1}{\beta^{\alpha_2}} (y_1\cdot(1-y_2))^{\alpha_2-1} e^{- y_1(1-y_2) / \beta} \stackrel{(2)}{\propto} y_1^{1 + (\alpha_1 - 1) + (\alpha_2 - 1)} y_2^{\alpha_1-1} (1-y_2)^{\alpha_2-1} e^{-\frac{y_1y_2}{\beta} - \frac{y_1(1-y_2)}{\beta}} = \left(y_1^{\alpha_1 + \alpha_2 - 1} e^{-y_1/\beta}\right) \left(y_2^{\alpha_1-1} (1-y_2)^{\alpha_2-1}\right) \propto f_{Y_1}(y_1)\, f_{Y_2}(y_2) $$
(2): we can drop the constants; since the joint density must integrate to one, the normalizing constants necessarily coincide with those of the $\Gamma(\alpha_1+\alpha_2,\beta)$ and $\operatorname{Beta}(\alpha_1,\alpha_2)$ densities.
So $f_Y(y_1,y_2) = f_{Y_1}(y_1)\, f_{Y_2}(y_2)$, i.e., $Y_1$ and $Y_2$ are independent.
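As a sanity check on the result (not part of the proof), one can simulate the construction and verify the necessary consequences of independence empirically; a sketch with `numpy`, where the parameter values $\alpha_1=2$, $\alpha_2=5$, $\beta=3$ are arbitrary choices of mine:

```python
# Monte Carlo sanity check: Y1 = X1 + X2 and Y2 = X1/(X1+X2) should be
# uncorrelated (a necessary condition for independence), and Y2 should
# have the Beta(a1, a2) mean a1/(a1+a2).
import numpy as np

rng = np.random.default_rng(0)
a1, a2, beta = 2.0, 5.0, 3.0   # arbitrary shape/scale parameters
n = 200_000

x1 = rng.gamma(shape=a1, scale=beta, size=n)
x2 = rng.gamma(shape=a2, scale=beta, size=n)
s = x1 + x2       # Y1 = X1 + X2
r = x1 / s        # Y2 = X1 / (X1 + X2)

print(np.corrcoef(s, r)[0, 1])      # close to 0
print(r.mean(), a1 / (a1 + a2))     # sample mean vs Beta mean
```

Zero correlation is only necessary, not sufficient, for independence, so this is a plausibility check rather than evidence replacing the density factorization above.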