I have a pdf defined as: $$f(x)=\lambda e^{-\lambda(x-b)}, \qquad x \ge b$$
Conduct a simulation study in R to explore the behaviour of the maximum likelihood estimator $\lambda_{MLE}$ for $\lambda$ on simulated data $X_1, \dots, X_n$ (independent copies of $X$ with parameter $\lambda$) according to the following instructions. Take $b = 0.01$, consider a setting in which $\lambda = 2$, and generate a plot of the mean squared error as a function of the sample size $n$. You should consider sample sizes between 100 and 5000 in increments of 10, with 100 trials per sample size. For each trial at each sample size, generate a random sample $X_1, \dots, X_n$ (independent copies of $X$ with parameter $\lambda = 2$), then compute the maximum likelihood estimate $\lambda_{MLE}$ for $\lambda$ based on the corresponding sample. Display a plot of the mean squared error of $\lambda_{MLE}$ as an estimator of $\lambda$ as a function of the sample size $n$.
I don't really understand how to go about the simulation study. I don't need a full answer, just an outline of how to approach the simulation. I should probably use the dexp() function in R, right? And what do they mean by 100 trials for each sample size? Please help!
Since you do not want a complete answer, just an outline to follow, you can proceed as below:
- Write a function `lambda_hat(sample)` which takes a sample and computes the MLE of $\lambda$ from it. Note that for simulating the data you want `rexp()` (random draws), not `dexp()` (the density), and you need to account for the shift $b$ in each observation.
- To compute the MSE of an estimator at a given sample size, you need to compute the estimate a number of times and then average the squared errors; the instructions want you to do that 100 times. Each of those 100 trials uses a fresh sample of a fixed size $n$, where $n$ comes from the vector `x` below. Wrap this in a function `MSE(n)`.
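In case it helps to check your own work once you derive the estimator, the closed form follows from setting the derivative of the log-likelihood to zero (with $b$ known):

$$\ell(\lambda)=n\log\lambda-\lambda\sum_{i=1}^{n}(x_i-b),\qquad \frac{d\ell}{d\lambda}=\frac{n}{\lambda}-\sum_{i=1}^{n}(x_i-b)=0 \;\Longrightarrow\; \lambda_{MLE}=\frac{1}{\bar{x}-b}.$$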
- Finally, define the grid of sample sizes, apply `MSE` over it, and plot the result:

        x <- seq(100, 5000, by = 10)
        y <- sapply(x, MSE)
        plot(x, y)
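Putting the pieces together, here is a minimal sketch of the whole study. The helper `draw_sample()` and the use of the closed-form MLE $1/(\bar{x}-b)$ are my additions for illustration, not something prescribed by the assignment:

```r
b <- 0.01       # shift parameter, as given in the problem
lambda <- 2     # true rate parameter
n_trials <- 100 # trials per sample size

# One sample of size n from the shifted exponential: X = b + Exp(rate = lambda)
draw_sample <- function(n) b + rexp(n, rate = lambda)

# Closed-form MLE of lambda given a sample (b known): 1 / (mean(x) - b)
lambda_hat <- function(sample) 1 / (mean(sample) - b)

# Monte Carlo estimate of the MSE of lambda_hat at sample size n:
# average squared error over n_trials independent samples
MSE <- function(n) {
  estimates <- replicate(n_trials, lambda_hat(draw_sample(n)))
  mean((estimates - lambda)^2)
}

x <- seq(100, 5000, by = 10)  # grid of sample sizes
y <- sapply(x, MSE)           # MSE at each sample size
plot(x, y, type = "l", xlab = "sample size n", ylab = "MSE of lambda_hat")
```

You should see the MSE decay roughly like $1/n$, which is the expected behaviour for an MLE.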