Do I get the right MLE and 90% confidence interval of a normal distribution?

I think I did step 1 above correctly, but I wonder whether I got the right confidence intervals for $\mu$ and $\sigma$ in step 2.
You inquire specifically about the confidence intervals. It is worthwhile to distinguish several cases:
(1) $\mu$ is unknown and $\sigma$ is known. We seek a CI for $\mu$. Here $\bar X$ is the MLE of $\mu$, with $$\frac{\bar X - \mu}{\sigma/\sqrt{n}} \sim Norm(0, 1).$$ A 95% CI for $\mu$ is $\bar X \pm 1.96\,\sigma/\sqrt{n}$; for the 90% interval you ask about, replace 1.96 with 1.645.
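For concreteness, here is a minimal sketch of case (1) in Python (scipy assumed available); the simulated sample, the seed, and the known $\sigma$ are illustrative assumptions, not values from your question:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
sigma = 2.0                                    # known population SD (assumed)
x = rng.normal(loc=5.0, scale=sigma, size=25)  # simulated normal sample

n, xbar = len(x), x.mean()                     # xbar is the MLE of mu
z = stats.norm.ppf(0.975)                      # 1.96; use ppf(0.95) for a 90% CI
half = z * sigma / np.sqrt(n)
print(f"95% z CI for mu: ({xbar - half:.3f}, {xbar + half:.3f})")
```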
(2) $\mu$ and $\sigma$ are both unknown. We seek a CI for $\mu$. Then $\bar X$ is the MLE of $\mu$; the MLE of $\sigma$ is $\hat\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^n (X_i - \bar X)^2}$ (note the divisor $n$), although inference is usually based on the sample standard deviation $S = \sqrt{\frac{\sum_{i=1}^n(X_i - \bar X)^2}{n-1}}$. A 95% CI for $\mu$ is $\bar X \pm t^*S/\sqrt{n},$ where $t^*$ cuts 2.5% from the upper tail of Student's t distribution with $n - 1$ degrees of freedom.
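The corresponding sketch for case (2), again with simulated data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
x = rng.normal(loc=5.0, scale=2.0, size=25)    # simulated normal sample

n, xbar = len(x), x.mean()
s = x.std(ddof=1)                              # S, with divisor n - 1
t_star = stats.t.ppf(0.975, df=n - 1)          # cuts 2.5% from the upper tail
half = t_star * s / np.sqrt(n)
print(f"95% t CI for mu: ({xbar - half:.3f}, {xbar + half:.3f})")
# Equivalently: stats.t.interval(0.95, df=n - 1, loc=xbar, scale=s / np.sqrt(n))
```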
(3) $\sigma$ is unknown and $\mu$ is known. We seek a CI for $\sigma.$ Then $nV/\sigma^2 \sim Chisq(df = n),$ where $V = (1/n)\sum_{i=1}^n (X_i - \mu)^2.$ This relationship can be used to get a 95% CI for $\sigma^2.$ Take square roots of both endpoints to get a CI for $\sigma.$
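Case (3) as a sketch, with $\mu$ treated as known (the value 5.0 is an assumption for the simulation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=3)
mu = 5.0                                       # known mean (assumed)
x = rng.normal(loc=mu, scale=2.0, size=25)

n = len(x)
nV = np.sum((x - mu) ** 2)                     # n * V
lo = nV / stats.chi2.ppf(0.975, df=n)          # upper quantile -> lower endpoint
hi = nV / stats.chi2.ppf(0.025, df=n)
print(f"95% CI for sigma^2: ({lo:.3f}, {hi:.3f})")
print(f"95% CI for sigma:   ({np.sqrt(lo):.3f}, {np.sqrt(hi):.3f})")
```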
(4) $\sigma$ and $\mu$ are both unknown. We seek a CI for $\sigma.$ Then $(n-1)S^2/\sigma^2 \sim Chisq(n-1).$ This relationship can be used to get a 95% CI for $\sigma^2.$ Take square roots of both endpoints to get a CI for $\sigma.$
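Case (4) differs only in using $S^2$ and $n - 1$ degrees of freedom:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=4)
x = rng.normal(loc=5.0, scale=2.0, size=25)    # simulated normal sample

n = len(x)
s2 = x.var(ddof=1)                             # S^2
num = (n - 1) * s2                             # (n-1)S^2 / sigma^2 ~ Chisq(n-1)
lo = num / stats.chi2.ppf(0.975, df=n - 1)
hi = num / stats.chi2.ppf(0.025, df=n - 1)
print(f"95% CI for sigma^2: ({lo:.3f}, {hi:.3f})")
print(f"95% CI for sigma:   ({np.sqrt(lo):.3f}, {np.sqrt(hi):.3f})")
```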
(5) $\sigma$ and $\mu$ are both unknown. We seek a 2-dimensional confidence region for both parameters simultaneously. Because $\bar X$ and $S^2$ (based on normal data) are independent random variables, we can combine the pivots of (1) and (4) to get a region in $(\mu, \sigma^2)$-space that is a 95% confidence region for the two parameters simultaneously. Details are not difficult, and are shown (among other places) in Mood, Graybill & Boes, 3rd ed. (1974), p. 384. This region (bounded by a horizontal strip and a parabola) does not have minimum area for a fixed confidence level. Roughly speaking, its area can be slightly reduced by using a roughly elliptical region that "rounds the corners" of the more elementary region, without changing its coverage probability.
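Finally, a hedged sketch of that elementary joint region. Because $\bar X$ and $S^2$ are independent, taking each marginal statement at level $\sqrt{0.95} \approx 0.9747$ makes the intersection a 95% region: the chi-square statement gives the horizontal strip in $\sigma^2$, and the $z$ statement gives the parabola in $(\mu, \sigma^2)$-space. The data and seed are again illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=5)
x = rng.normal(loc=5.0, scale=2.0, size=25)    # simulated normal sample
n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)

level = np.sqrt(0.95)                          # per-statement confidence
alpha = 1 - level

# Horizontal strip: bounds on sigma^2 from (n-1)S^2 / sigma^2 ~ Chisq(n-1)
sig2_lo = (n - 1) * s2 / stats.chi2.ppf(1 - alpha / 2, df=n - 1)
sig2_hi = (n - 1) * s2 / stats.chi2.ppf(alpha / 2, df=n - 1)

# Parabola: (xbar - mu)^2 <= z^2 * sigma^2 / n, a mu-band that widens with sigma^2
z = stats.norm.ppf(1 - alpha / 2)

def mu_band(sig2):
    half = z * np.sqrt(sig2 / n)
    return xbar - half, xbar + half

print(f"sigma^2 strip: ({sig2_lo:.3f}, {sig2_hi:.3f})")
lo_mu, hi_mu = mu_band(sig2_hi)
print(f"mu band at sigma^2 = {sig2_hi:.3f}: ({lo_mu:.3f}, {hi_mu:.3f})")
```

The joint region is the set of $(\mu, \sigma^2)$ pairs with $\sigma^2$ in the strip and $\mu$ in the band evaluated at that $\sigma^2$.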