This question is threefold.
We have an LTI system that is a first-order Butterworth LP filter with the power transfer function

where $f_u = 110$ Hz and $f_l = 90$ Hz.
The input $X(t)$ has the autocorrelation $R_X(\tau) = 5e^{-600|\tau|}$.
1) How can I calculate the power spectral density of the output in MATLAB? With the FFT? How do I represent the autocorrelation as a vector?
2) How can I simulate the system and plot the output in MATLAB?
3) I am also supposed to write a MATLAB function that returns samples of the time-domain function as $X(n\Delta t)$, $n = 0, 1, \ldots, N$, where $\Delta t = 1/f_s$ and $f_s = 2000$ Hz.
This isn't a complete answer, but it will help you get started. To get the power spectral density of the output, find the Fourier transform of $R_X(\tau)$ and multiply it by $\vert H(f)\vert^2$. I'm not sure what you mean by representing the autocorrelation as a vector. It's a function of $\tau$, so you could create a vector of autocorrelation values by evaluating it at a series of values of $\tau$. For instance,
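something like the following sketch. The $\tau$ range and frequency grid are my assumptions; the closed-form PSD uses the standard transform pair $a e^{-b|\tau|} \leftrightarrow 2ab/(b^2 + (2\pi f)^2)$:

```matlab
% Evaluate R_X(tau) = 5*exp(-600|tau|) on a grid of tau values
fs  = 2000;            % sampling rate from part 3 (Hz)
dt  = 1/fs;
tau = -0.05:dt:0.05;   % range is an assumption; R_X has decayed to ~0 by |tau| = 0.05
Rx  = 5*exp(-600*abs(tau));

% Input PSD two ways: closed form vs. FFT of the sampled autocorrelation
f      = -500:0.5:500;                       % frequency grid (Hz), an assumption
Sx     = 2*5*600 ./ (600^2 + (2*pi*f).^2);   % analytic FT of 5*exp(-600|tau|)
Sx_fft = abs(fftshift(fft(ifftshift(Rx))))*dt;  % numerical check (scaled by dt)

% Output PSD: multiply by the given power transfer function,
% S_Y(f) = S_X(f) .* |H(f)|.^2
```

The `ifftshift` recenters the symmetric autocorrelation so the FFT treats $\tau = 0$ as the first sample; without it the numerical transform picks up a spurious phase.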
To simulate, suppose that `x` is the function you are interested in, and that it takes in a value for time and returns the value of the function (of course `x` is a horrible name for a function, but this is all for illustration purposes). Then we could see what the filter does with something like this:
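A sketch along those lines, with my assumptions marked: the test signal is a placeholder (in the actual problem `x` would be a realization of the random process $X(t)$), and I've read the two cutoffs as defining a band-pass filter:

```matlab
% Placeholder test input -- an assumption; a 100 Hz tone sits inside the passband.
% In the real problem x would be a realization of the random process X(t).
x = @(t) cos(2*pi*100*t);

fs = 2000;  dt = 1/fs;
t  = 0:dt:0.5;

% First-order Butterworth band-pass between f_l = 90 Hz and f_u = 110 Hz
[b, a] = butter(1, [90 110]/(fs/2), 'bandpass');
y = filter(b, a, x(t));

plot(t, y), xlabel('t (s)'), ylabel('y(t)'), title('Filter output')
```

`butter` wants the cutoffs normalized by the Nyquist frequency `fs/2`, which is why the `[90 110]` vector is divided through before being passed in.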
Double-check the above for errors, though, since I haven't actually run it.
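For part 3, here is a minimal sketch of a sampling function. The generator is my assumption: a first-order AR recursion (a discretized Ornstein-Uhlenbeck process), which reproduces the exponential autocorrelation when driven by Gaussian white noise:

```matlab
function X = sampleX(N)
% Returns samples X(n*dt), n = 0..N, of a process with R_X(tau) = 5*exp(-600|tau|).
% Uses the AR(1) recursion
%   X[n] = rho*X[n-1] + w[n],  rho = exp(-600*dt),  var(w) = 5*(1 - rho^2),
% which gives E[X[n]X[n+k]] = 5*rho^|k| = 5*exp(-600*|k|*dt), matching R_X.
fs  = 2000;  dt = 1/fs;
rho = exp(-600*dt);
X   = zeros(1, N+1);
X(1) = sqrt(5)*randn;                  % start in the stationary distribution
for n = 2:N+1
    X(n) = rho*X(n-1) + sqrt(5*(1 - rho^2))*randn;
end
end
```

You can sanity-check it by estimating the autocorrelation of a long run (e.g. with `xcorr(X, 'biased')`) and comparing against $5e^{-600|\tau|}$.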