When we talk about conditional probability, do we usually update the sigma-algebra too?

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-05-14

In Definition 1.3.2 of Casella and Berger's textbook *Statistical Inference*, which introduces the conditional probability $P(A|B)=\frac{P(A\cap B)}{P(B)}$, both $A$ and $B$ are assumed to be events in the original sample space $S$, and thus both are elements of the original sigma-algebra $\mathcal{B}$. My question is: why do we not restrict $A$ to lie in a new sigma-algebra consisting only of subsets of $B$? Given that we have updated our sample space from $S$ to $B$, this seems natural.
What you're suggesting is effectively to define $P(A\mid B)$ only for sets $A$ lying in the trace sigma-algebra $\{\,E\cap B \mid E\in\mathcal{B}\,\}$ (writing $E$ for the dummy variable, since $S$ already denotes the sample space). One reason why Casella and Berger would not do this is that it would vitiate some of the applications of conditional probability they make later in the book, most notably Bayesian hypothesis testing in Section 8.2.2 (p. 379 of the second edition).
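As a concrete illustration of the trace construction, here is a minimal sketch on a hypothetical four-point sample space (the sets and numbers are invented for illustration; for a finite space the sigma-algebra is just the power set):

```python
from itertools import combinations

# Hypothetical tiny sample space and conditioning event; illustrative only.
S = frozenset({1, 2, 3, 4})
B = frozenset({1, 2})

def power_set(xs):
    """All subsets of xs, as frozensets."""
    xs = list(xs)
    return {frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)}

# On a finite S, the sigma-algebra of all events is the power set of S.
sigma_S = power_set(S)

# Trace sigma-algebra on B: intersect every event with B.
trace = {A & B for A in sigma_S}

# The trace coincides with the power set of B, so it really is
# a sigma-algebra on the smaller sample space B.
assert trace == power_set(B)
print(len(trace))
```

The point of the answer is that even though this smaller sigma-algebra exists, restricting the first argument of $P(\cdot\mid B)$ to it would forbid expressions like $P(H_i\mid X)$ where $H_i$ need not be a subset of $X$.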
Bayesian hypothesis testing relies on a method of updating the relative probabilities of various hypotheses $H_1, H_2,\dots$, regarded as events in the sigma-algebra $\mathcal{B}$, in the light of the fact that some other event $X$ in $\mathcal{B}$ is known to have occurred. The formula for the updated probability of $H_i$ relative to that of $H_j$ is $$ \frac{P(H_i\mid X)}{P(H_j\mid X)}=\frac{P(X\mid H_i)\,P(H_i)}{P(X\mid H_j)\,P(H_j)}\ , $$ which follows from the almost trivial fact that both sides of the identity are equal to $\frac{P(H_i\cap X)}{P(H_j\cap X)}$. If this formula is to be of any use, however, the first argument of the conditional probabilities must not be restricted in the way you have suggested.
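The ratio identity can be checked numerically. A minimal sketch, with made-up priors and likelihoods for two hypotheses that partition the sample space (all numbers are assumptions, not from the text):

```python
# Hypothetical priors P(H1), P(H2) and likelihoods P(X|H1), P(X|H2).
p_H1, p_H2 = 0.3, 0.7
p_X_given_H1 = 0.8
p_X_given_H2 = 0.2

# Joint probabilities P(Hi ∩ X) = P(X|Hi) P(Hi).
p_H1_and_X = p_X_given_H1 * p_H1
p_H2_and_X = p_X_given_H2 * p_H2

# P(X) by total probability, since H1, H2 partition the sample space.
p_X = p_H1_and_X + p_H2_and_X

# Left side: ratio of posteriors P(H1|X) / P(H2|X).
lhs = (p_H1_and_X / p_X) / (p_H2_and_X / p_X)
# Right side: ratio of likelihood-times-prior.
rhs = (p_X_given_H1 * p_H1) / (p_X_given_H2 * p_H2)

# Both sides reduce to P(H1 ∩ X) / P(H2 ∩ X).
assert abs(lhs - rhs) < 1e-12
print(lhs)
```

Note that $H_1$ here is not a subset of $X$, so $P(H_1\mid X)$ would be undefined under the restricted definition proposed in the question.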