Maximum Likelihood Estimation: Transformation from Product-Of-Likelihoods to Sum-Of-Log-Likelihoods


I'm reading the Deep Learning Book by Goodfellow, Bengio and Courville, and I'm struggling to follow an argument in Section 5.5 (pages 129-130):

[Excerpt from Section 5.5 of the Deep Learning Book]

How do they make the jump from equation 5.57 to 5.58? Have they implicitly taken the log of the product, allowing them to apply the product rule for logarithms? I.e.

$$\log (ab)=\log(a)+\log(b)$$

Or is this a special case where the argmax allows for this transformation?


Best answer:

Yes: taking the log of the product turns it into a sum of logs,
$$\log(xy)=\log(x)+\log(y),$$
and because $\log$ is a strictly increasing function, applying it does not change *where* the maximum occurs, only the value attained there. So the argmax is preserved:
$$\arg\max_\theta \prod_{i=1}^m p_\text{model}\left(x^{(i)};\theta\right) = \arg\max_\theta \sum_{i=1}^m \log p_\text{model}\left(x^{(i)};\theta\right).$$
In other words, it is not a special property of this problem; any strictly increasing transformation of the objective leaves the argmax unchanged, and the log is chosen because it conveniently converts the product into a sum (and avoids numerical underflow when multiplying many small probabilities).
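As a quick numerical check, here is a small sketch (with made-up Bernoulli data, not from the book) showing that the product of likelihoods and the sum of log-likelihoods peak at the same parameter value on a grid:

```python
import numpy as np

# Hypothetical data: 8 Bernoulli observations with 6 successes,
# so the MLE for the success probability should be 6/8 = 0.75.
data = np.array([1, 1, 0, 1, 0, 1, 1, 1])

# Candidate parameter values on a grid (avoiding 0 and 1, where log blows up).
thetas = np.linspace(0.01, 0.99, 99)

# Equation-5.57-style objective: product of per-example likelihoods.
likelihood = np.array(
    [np.prod(t**data * (1 - t) ** (1 - data)) for t in thetas]
)

# Equation-5.58-style objective: sum of per-example log-likelihoods.
log_likelihood = np.array(
    [np.sum(data * np.log(t) + (1 - data) * np.log(1 - t)) for t in thetas]
)

# Because log is strictly increasing, both objectives peak at the same theta.
print(thetas[np.argmax(likelihood)])      # 0.75
print(thetas[np.argmax(log_likelihood)])  # 0.75
```

With many more observations the raw product underflows to zero in floating point, while the sum of logs stays well-behaved, which is the practical reason the book works with the log-likelihood.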