Page 192 of Kevin P. Murphy's "Machine Learning: A Probabilistic Perspective" says
The bootstrap is a simple Monte Carlo technique to approximate the sampling distribution. This is particularly useful in cases where the estimator is a complex function of the true parameters.
The idea is simple. If we knew the true parameters $\theta^*$ ...
If we already know the true parameters $\theta^*$, why do we need to compute the estimate at all?
As stated in the first sentence of your quote, the purpose of the bootstrap is not to compute the estimator, but rather to approximate the sampling distribution of the estimator.
Regarding the next paragraph, you should keep reading. The quote describes how you would approximate the sampling distribution in the idealized situation where you do know the parameters (which is not the case and, as you point out, would make the entire estimation problem pointless), but the book then follows it up with what you would actually do: generate the fake datasets from the plug-in estimate $\hat{\theta}(\mathcal{D})$ computed on the observed data, rather than from the unknown $\theta^*$. That is the parametric bootstrap.