Why does the bootstrap need to compute the estimate to approximate the sampling distribution, if we knew the true parameters $\theta^*$?


Page 192 of Kevin Patrick Murphy's "Machine Learning: A Probabilistic Perspective" says

The bootstrap is a simple Monte Carlo technique to approximate the sampling distribution. This is particularly useful in cases where the estimator is a complex function of the true parameters.

The idea is simple. If we knew the true parameters $\theta^*$ ...

If we already knew the true parameters $\theta^*$, why would I need to compute the estimate at all?

Best answer

As stated in the first sentence of your quote, the purpose of the bootstrap is not to compute the estimator, but rather to approximate the sampling distribution of the estimator.

Regarding the next paragraph, you should keep reading. The quote

If we knew the true parameters $\theta^*$...

describes how you would approximate the sampling distribution in the idealized situation where you did know the true parameters (which, as you point out, is not the case and would make the whole estimation problem pointless), but then follows it up with

Since $\theta^*$ is unknown, we generate the samples using $\hat{\theta}(\mathcal{D})$ instead.

which describes what you would actually do.
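To make this concrete, here is a minimal sketch of the parametric bootstrap in Python/NumPy. The Gaussian-mean example and all numbers are illustrative assumptions, not taken from the book: we fit $\hat{\theta}(\mathcal{D})$ to the observed data, generate synthetic datasets from the fitted model (since $\theta^*$ is unavailable), and recompute the estimator on each synthetic dataset to approximate its sampling distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed data D, drawn from some unknown true distribution.
D = rng.normal(loc=5.0, scale=2.0, size=100)

# Point estimate theta_hat(D): here, the sample mean and standard deviation.
mu_hat, sigma_hat = D.mean(), D.std(ddof=1)

# Parametric bootstrap: since theta* is unknown, draw synthetic datasets
# from the fitted model theta_hat(D) and recompute the estimator on each.
B = 2000
boot_estimates = np.empty(B)
for b in range(B):
    D_b = rng.normal(loc=mu_hat, scale=sigma_hat, size=D.size)
    boot_estimates[b] = D_b.mean()

# The spread of the bootstrap estimates approximates the sampling
# distribution of the estimator, e.g. its standard error.
print("estimate:", mu_hat)
print("bootstrap standard error:", boot_estimates.std(ddof=1))
```

If we actually knew $\theta^*$, we would simply replace `mu_hat, sigma_hat` with the true values in the sampling loop; because we do not, the fitted $\hat{\theta}(\mathcal{D})$ stands in for them, which is exactly the substitution the quoted passage describes.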