Can somebody tell me the clear difference between MLE and the usual probability inference?
2026-03-29 10:18:24
388 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
The idea of MLE is that you construct a model with certain parameters. However, you do not know the true parameter values of the distribution, so you use a sample from the population to estimate them. MLE is an estimation method. Here is an example using the Bernoulli distribution. This time, you are the main character of the story.
"Let's play a game! Flip this coin. If it lands on heads, I win the bet and if it lands on tails, you win the bet. Let's bet ten dollars a round." A game master approached you with this proposal.
"How do I know if the coin you possess is fair?", you asked.
"Test it!"
With that, you receive the coin and attempt to test it. You know that the only possible outcomes are heads and tails, and that heads occurs with some probability $p$. Denoting $1$ as heads and $0$ as tails, and letting $X$ be the random variable recording the outcome of a toss, you say
$$ X = \begin{cases}1 \text{ with probability } p\\0 \text{ with probability } 1-p\end{cases} $$
This is the formulation of the model.
Alternatively, you can write the probability mass function of the distribution as $f_X(x) = p^x(1-p)^{1-x}$ for $x \in \{0,1\}$.
Remember! At this point, you do not know the actual or true value of $p$. However, you shall attempt to estimate it. I will explain the idea here rather than walk through the full derivation.
The idea here is that you estimate the value of $p$ from your sample because you do not know the true value. In this case the MLE of $p$ is indeed $\bar x$, but it is not just a sample mean from elementary statistics: $\bar x$ is the value of $p$ that maximizes the likelihood function. The likelihood function, although the notation can look counter-intuitive, is the joint probability of the observed sample, viewed as a function of the parameter $p$. Hence you see the notation $L(p) = L(p \mid x_1, x_2, \cdots, x_n)$.
Suppose you toss the coin $5$ times and the sample values are $1,0,1,1,1$. The likelihood function is then $$L(p) = p^4(1-p).$$ Attempt to derive this via the pmf of the distribution above.
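For reference, assuming the tosses are independent, the derivation from the pmf $f_X(x) = p^x(1-p)^{1-x}$ goes like this:
$$ L(p) = \prod_{i=1}^{5} p^{x_i}(1-p)^{1-x_i} = p^{\sum_i x_i}(1-p)^{5-\sum_i x_i} = p^{4}(1-p). $$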
What you are doing in MLE is finding the value of $p$ that maximizes this function (that is why the estimate is said to maximize the likelihood): you want the value of $p$ under which the observed sample is most probable.
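As a minimal sketch (not from the original answer), here is the maximization done numerically for the five tosses above, using a simple grid search over $p$; the maximizer should agree with the sample mean $4/5$:

```python
import numpy as np

# Sample from the story: 1 = heads, 0 = tails
sample = np.array([1, 0, 1, 1, 1])

def likelihood(p):
    # Product of p^x * (1-p)^(1-x) over the sample, i.e. L(p) = p^4 (1-p)
    return np.prod(p ** sample * (1 - p) ** (1 - sample))

# Grid search over p in (0, 1) in steps of 0.001
grid = np.linspace(0.001, 0.999, 999)
p_hat = grid[np.argmax([likelihood(p) for p in grid])]
print(round(p_hat, 3))  # close to the sample mean 4/5 = 0.8
```

In practice one maximizes the log-likelihood instead, since the product of many probabilities underflows quickly; the maximizer is the same because $\log$ is monotone.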
If you understand this, you can extend the idea to other distributions like the Poisson, or even continuous distributions like the uniform, Gamma, or Normal (in the Normal case you have to estimate 2 parameters concurrently).
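To illustrate the two-parameter Normal case, here is a hedged sketch (the data are simulated, not from the original answer). The Normal MLEs have closed forms: the sample mean for $\mu$, and the mean of squared deviations for $\sigma^2$ (note the MLE divides by $n$, not $n-1$):

```python
import numpy as np

# Simulated data for illustration only: true mu = 10, true sigma = 2
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=1000)

# Closed-form Normal MLEs
mu_hat = data.mean()
sigma2_hat = np.mean((data - mu_hat) ** 2)  # same as data.var(ddof=0)
print(mu_hat, sigma2_hat)  # should be near 10 and 4
```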
Remember, in the real world you do not have the luxury of sampling the whole population, which may run to millions or billions of entries (the log returns of a stock might be updated several times a second, so imagine how many entries you have if you analyze 10 years). Nor do you know the values of the parameters. Beyond this trivial example, consider this:
You believe that travel time from your home to the city follows a Poisson distribution with mean $\lambda$, but you do not know the true value of $\lambda$ because it depends on the traffic each day. You want to estimate $\lambda$ so you can plan your weekly fuel usage. In this case, do you know the true value of $\lambda$? If not, how would you estimate it?
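A possible sketch of the answer to this exercise, with made-up data (the travel times below are hypothetical, in minutes, treated as counts): maximizing the Poisson log-likelihood numerically should recover the sample mean, which is the analytic MLE for $\lambda$.

```python
import numpy as np

# Hypothetical travel times logged over five days (illustration only)
times = np.array([32, 28, 35, 30, 31])

def log_likelihood(lam):
    # Poisson log-likelihood sum of x*log(lam) - lam - log(x!);
    # the log(x!) term does not depend on lam, so it is dropped.
    return np.sum(times * np.log(lam) - lam)

# Grid search over lambda in steps of 0.01
grid = np.linspace(1.0, 60.0, 5901)
lam_hat = grid[np.argmax([log_likelihood(lam) for lam in grid])]
print(lam_hat, times.mean())  # both should be (approximately) 31.2
```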