Given the model $$Y_t = b_0 + b_1 \cdot X_t + b_2 \cdot Z_t + e_t,$$ where the error term $e_t$ follows a random walk form of serial correlation, $e_t = e_{t-1} + u_t$. Assume further that $u_t$ has zero mean, constant variance, and zero covariance with $u_s$ for $s \neq t$.
How can we transform this model so that the same data can be used to estimate a specification satisfying the Gauss-Markov assumptions? In other words, how do we obtain a form on which OLS gives best linear unbiased estimates?
First differencing does the job. Subtracting the equation at $t-1$ from the equation at $t$ yields $$ Y_t - Y_{t-1} = (b_0 - b_0) + b_1(X_t - X_{t-1}) + b_2(Z_t - Z_{t-1}) + \big((e_{t-1} + u_t) - e_{t-1}\big) \\ \Delta Y_t = b_1\,\Delta X_t + b_2\,\Delta Z_t + u_t. $$ The intercept cancels, and the error of the differenced equation is $u_t$, which by assumption has zero mean, constant variance, and no serial correlation, so OLS on the differenced data satisfies the Gauss-Markov assumptions. One observation is lost and $b_0$ is no longer identified, but the slopes $b_1$ and $b_2$ are the same as in the original model.
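As a quick sanity check, here is a minimal numpy-only simulation sketch (all names and parameter values are illustrative, not from the question): generate data with a random-walk error, then estimate the slopes by OLS on first differences.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000
b0, b1, b2 = 1.0, 2.0, -0.5   # illustrative true parameters

X = rng.normal(size=T)
Z = rng.normal(size=T)
u = rng.normal(size=T)        # i.i.d. innovations: zero mean, constant variance
e = np.cumsum(u)              # random walk error: e_t = e_{t-1} + u_t
Y = b0 + b1 * X + b2 * Z + e

# First-difference everything; the intercept cancels, so regress
# dY on dX and dZ without a constant: dY_t = b1*dX_t + b2*dZ_t + u_t
dY, dX, dZ = np.diff(Y), np.diff(X), np.diff(Z)
D = np.column_stack([dX, dZ])
coef, *_ = np.linalg.lstsq(D, dY, rcond=None)
print(coef)  # should be close to the true slopes (2.0, -0.5)
```

Running OLS on the levels instead would leave the random-walk component in the residuals, badly violating the no-serial-correlation assumption; the differenced regression recovers the slopes with well-behaved errors.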