I am trying to study the EM algorithm and Maximum Likelihood Estimation. Somehow they both sound the same to me, but I can't really say what the difference is. Maybe I don't really understand either of them; I have just started.
Can somebody tell me what each of them does, and what the relationship or difference between them is?
As per my understanding:
EM
Assumption: The given data comes from a mixture of multiple classes/distributions.
Input: Data points
Output: Each data point assigned to a distribution, plus the parameters of each distribution.
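To make concrete what I mean, here is a minimal sketch of my picture of EM, assuming a two-component 1D Gaussian mixture (the data and the initial guesses are made up for illustration):

```python
import math
import random

random.seed(0)

# Toy data: points drawn from two 1D Gaussian clusters (assumed example)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(5.0, 1.0) for _ in range(200)]

def pdf(x, mu, sigma):
    """Gaussian density at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Initial guesses for the two components
mu = [-1.0, 1.0]
sigma = [1.0, 1.0]
pi = [0.5, 0.5]  # mixing weights

for _ in range(50):
    # E-step: responsibility of each component for each data point
    resp = []
    for x in data:
        w = [pi[k] * pdf(x, mu[k], sigma[k]) for k in range(2)]
        s = sum(w)
        resp.append([wk / s for wk in w])
    # M-step: re-estimate each component's parameters by a weighted
    # maximum-likelihood fit, using the responsibilities as weights
    for k in range(2):
        nk = sum(r[k] for r in resp)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        sigma[k] = math.sqrt(
            sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk)
        pi[k] = nk / len(data)

# After convergence, each point's responsibilities say which cluster it
# belongs to, and (mu, sigma, pi) are the fitted distribution parameters.
```

Is this roughly the right picture: the E-step classifies the points and the M-step fits the distributions?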
MLE
Assumption: The form of the distribution from which the data points originate (e.g. Gaussian) is known.
Input: Data points
Output: The parameters of the distribution that maximize the probability (likelihood) of the data points having originated from that particular distribution.
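For comparison, here is what I mean by MLE in the simplest case I know: fitting a single Gaussian, where maximizing the likelihood has a closed form (the sample mean and the biased sample standard deviation). The data values are made up for illustration:

```python
import math

# Toy data points (assumed example)
data = [2.1, 1.9, 2.4, 2.0, 1.6, 2.3, 1.7]

# For a single Gaussian, the likelihood-maximizing parameters are
# the sample mean and the (biased, divide-by-n) sample standard deviation.
n = len(data)
mu_hat = sum(data) / n
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / n)
```

So here there is a single known distribution family and no classification step at all.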
If this is correct, then it seems EM might be performing MLE as an intermediate step in each iteration.
Please correct me if I am wrong.