Consider a scenario in which an individual flips a coin ten times and obtains the following sequence: $$\{HHHTTHHTHH\}$$
The probability of obtaining this particular sequence is $$\pi^7(1-\pi)^3,$$ where $\pi$ is the probability of flipping heads. We can estimate $\pi$ by maximum likelihood. Differentiating, $$\frac{d}{d\pi}\,\pi^7(1-\pi)^3=7\pi^6(1-\pi)^3-3\pi^7(1-\pi)^2=\pi^6(1-\pi)^2(7-10\pi)=0.$$ Discarding the endpoints $\pi=0$ and $\pi=1$, the likelihood is maximized at $\pi=0.7$ (as one would expect). I understand that one can proceed similarly for a continuous distribution, such as the normal distribution (with two parameters in that case). However, I'm having trouble finding examples of how to proceed when working with a continuous probability density function. Examples would be appreciated.
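As a quick numerical sanity check (a sketch using NumPy, not part of the derivation above), one can evaluate the likelihood on a grid and confirm that the maximizer is $0.7$:

```python
import numpy as np

# Likelihood of the observed sequence HHHTTHHTHH (7 heads, 3 tails)
# as a function of the heads probability pi.
def likelihood(pi):
    return pi**7 * (1 - pi)**3

# Evaluate on a fine grid over [0, 1] and locate the maximizer numerically.
grid = np.linspace(0, 1, 100001)
pi_hat = grid[np.argmax(likelihood(grid))]
print(pi_hat)  # approximately 0.7
```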
Following up on the suggestion made by BruceET in a comment: let $T_1,\ldots,T_n$ be i.i.d. from the exponential distribution with density $$ \lambda e^{-\lambda t}\,dt \quad \text{for } t\ge0, $$ so that $\Pr(T_1>t) = e^{-\lambda t}$ for $t\ge0.$
Then the likelihood is $$ L(\lambda) = \prod_{i=1}^n (\lambda e^{-\lambda t_i}) = \lambda^n e^{-\lambda\sum_{i=1}^n t_i} $$ and the log-likelihood is $$ \ell(\lambda) = \log L(\lambda) = n\log\lambda - \lambda\sum_{i=1}^n t_i, $$ so that $$ \ell\,'(\lambda) = \frac n \lambda - \sum_{i=1}^n t_i\quad \begin{cases} \ge0 & \text{if } 0\le\lambda\le \dfrac n {\sum_{i=1}^n t_i}, \\[8pt] \le0& \text{if } \phantom{0 \le {}} \lambda \ge \dfrac n {\sum_{i=1}^n t_i}. \end{cases} $$ Hence $\ell$ increases up to $\lambda = n\big/\sum_{i=1}^n t_i$ and decreases thereafter, so the maximum-likelihood estimate is $$ \hat\lambda = \frac n {\sum_{i=1}^n t_i} = \frac 1 {\bar t}, $$ the reciprocal of the sample mean.
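A short simulation can confirm this (a sketch using NumPy; the true rate $\lambda=2$ and the sample size are arbitrary choices for illustration). It compares the closed-form estimate $n/\sum t_i$ with a direct grid maximization of the log-likelihood $\ell(\lambda)$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam_true = 2.0  # assumed true rate, chosen only for illustration
t = rng.exponential(scale=1 / lam_true, size=10_000)

# Closed-form MLE from the derivation: lambda_hat = n / sum(t_i)
lam_hat = len(t) / t.sum()

# Cross-check: maximize ell(lambda) = n log(lambda) - lambda * sum(t_i)
# over a grid of candidate rates.
lams = np.linspace(0.5, 4.0, 4001)
loglik = len(t) * np.log(lams) - lams * t.sum()
lam_grid = lams[np.argmax(loglik)]

print(lam_hat, lam_grid)  # both close to 2.0
```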