I'm new to statistics and I'm a bit confused about the concepts of the 'chance of a variable' and the 'parameters of a probability distribution'. Is a chance also a parameter? And if so, can computing the chance of a variable be considered an estimate of the parameters?
Is the chance of a variable also a parameter for a probability distribution?
161 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Maybe you are new enough to statistics that you've only thought about normal and binomial distributions.
BINOMIAL. A binomial random variable $X$ counts the number of Successes in a simple experiment. Perhaps $X$ is the number of times you get a 6 when you roll a die twice. There are two parameters of the distribution of this random variable. The first is $n = 2$, the number of independent 'trials' (here rolls of the die). The second is the probability $p = 1/6,$ which is the probability (or 'chance') that you get a six on any one roll. But it is not ALWAYS the case that a parameter is the 'chance' of anything.
From this information we can derive the distribution of $X$, which gives the probabilities $P(X = 0) = (5/6)^2 = 25/36,\; P(X = 1) = 2(1/6)(5/6) = 10/36,$ and $P(X = 2) = (1/6)^2 = 1/36.$ Notice that the probabilities of the possible values of $X$ must always sum to 1. Here, we have $25/36 + 10/36 + 1/36 = 1.$ One can show that the mean of this distribution is $\mu = np = 1/3$ and that the standard deviation is $\sigma = \sqrt{np(1-p)} = \sqrt{10}/6 \approx 0.527.$
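As a quick sanity check, the binomial probabilities, mean, and standard deviation above can be computed directly from the formula $P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$; this is a minimal sketch using only the Python standard library:

```python
from math import comb, sqrt

n, p = 2, 1 / 6  # two die rolls; chance of a six on any one roll

# Binomial pmf: P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mean = n * p                 # 1/3
sd = sqrt(n * p * (1 - p))   # sqrt(10)/6 ≈ 0.527

print(pmf)        # P(X=0)=25/36, P(X=1)=10/36, P(X=2)=1/36
print(mean, sd)
```

The probabilities sum to 1, as every distribution requires.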
NORMAL. A normal random variable $Y$ also has two parameters, the mean $\mu$ (or center) of the distribution, and the standard deviation $\sigma$, which says how spread out the values are likely to be.
This is a continuous random variable, so it is not possible to list the values it will take. Instead of a list of individual probabilities, we give a density function. The total area under the density function is 1. For example, if $\mu = 100$ and $\sigma = 15$ then it turns out that $P(Y > 100) = 1/2 = 50\%$; half of the area under the density curve lies to the right of 100. Also, $P(Y > 130) = 0.02275$ (just under $2.5\%$). (Perhaps this random variable models IQ scores; there are not many people with IQs above 130.)
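The two probabilities quoted above can be verified numerically; a small sketch using the standard library's `statistics.NormalDist` (the IQ interpretation is just the illustrative assumption from the text):

```python
from statistics import NormalDist

# Normal distribution with the two parameters mu = 100, sigma = 15
iq = NormalDist(mu=100, sigma=15)

# P(Y > y) is the area under the density to the right of y: 1 - CDF(y)
p_above_mean = 1 - iq.cdf(100)  # exactly 1/2 by symmetry
p_above_130 = 1 - iq.cdf(130)   # 130 is two standard deviations above the mean

print(p_above_mean)  # 0.5
print(p_above_130)   # ≈ 0.02275
```

Note that 130 sits exactly $2\sigma$ above the mean, which is where the familiar figure $0.02275$ comes from.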
In this case the mean and standard deviation are the same as the parameters, but neither of them has any direct interpretation as the 'chance' of anything.
MORE GENERAL. As you see additional random variables and their distributions, it is natural to try to see if the parameters have any intuitive interpretation. Often the answer is YES for the most commonly-used distributions, and it is worthwhile to try to understand the intuitive connection. It may have to do with a 'chance', a 'rate', something to do with the shape of the distribution, or something else. There is no general rule for this. (Also, sometimes the parameters are very abstract and any kind of intuitive interpretation is difficult.)
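As one example of a parameter with a 'rate' interpretation, the exponential distribution's parameter $\lambda$ is the average number of events per unit time, and its mean is $1/\lambda$. A minimal simulation sketch (the value $\lambda = 2$ is an assumed choice for illustration, not from the text):

```python
import random
from statistics import mean

random.seed(0)  # fixed seed so the run is reproducible

lam = 2.0  # rate parameter: assumed value, average events per unit time
samples = [random.expovariate(lam) for _ in range(100_000)]

# The mean of an Exponential(lam) distribution is 1/lam, so the
# sample mean should land close to 0.5 here.
print(mean(samples))
```

Here the parameter is not the 'chance' of anything, but it still has a clear intuitive reading as a rate.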
(The original answer included plots of the two distributions used as examples above, with a vertical red line marking the location of the mean in each case.)