Why does the regressor 1/t in y_t = B1 + B2(1/t) + u_t tend to zero?


My textbook says that as n tends to infinity, each observation provides less and less information about B2. This happens because 1/t tends to zero, and hence the regressor varies less and less across observations as t grows. Can someone explain this?
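A numerical sketch of the textbook's claim (my own illustration, using the standard OLS variance formula rather than anything from the book): the precision of the OLS estimate of B2 is governed by S_n = sum over t of (x_t - x̄)², where x_t = 1/t. Because 1/t bunches up near zero, S_n converges to a finite limit (about π²/6 ≈ 1.645) instead of growing with n, so Var(B2_hat) = σ²/S_n stays bounded away from zero no matter how many observations you add:

```python
# Sum of squared deviations of the regressor x_t = 1/t about its mean.
# If this sum stops growing, extra observations add almost no information
# about the slope B2 in y_t = B1 + B2*(1/t) + u_t.
def sum_sq_dev(n):
    x = [1 / t for t in range(1, n + 1)]
    xbar = sum(x) / n
    return sum((xi - xbar) ** 2 for xi in x)

for n in (10, 100, 1000, 100000):
    print(n, round(sum_sq_dev(n), 4))
# The sum levels off near 1.645 rather than increasing with n,
# so Var(B2_hat) = sigma^2 / S_n does not shrink toward zero.
```

Contrast this with a regressor like x_t = t, where the sum of squared deviations grows without bound and the estimator is consistent.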