Solution to minimization problem when parameter goes to infinity


I have a time series with $T$ periods, $\{y_1, y_2, \dots, y_T\}$, and I need to minimize the following expression over the sequence $\{\tau_t\}$:

$\min_{\{\tau_t\}} \sum_{t=1}^T (y_t - \tau_t)^2 + \lambda \sum_{t=2}^{T-1} ((\tau_{t+1} - \tau_t) - (\tau_t - \tau_{t-1}))^2$

What is $\tau_t^*$ if $\lambda \to \infty$?

My intuition is that as $\lambda \to \infty$, minimizing the second term becomes the priority. Since the second term penalizes changes in the rate of change, the penalty can only be driven to zero when $\tau_{t+1} - \tau_t$ is constant, i.e. when $\tau_t = a + bt$ is linear in $t$. Among all such linear trends, the first term should then pin down $a$ and $b$ as the least-squares fit of $y_t$ on a constant and a time trend. Is this reasoning correct?
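This intuition can be checked numerically. The objective is quadratic in $\tau$, so the first-order condition gives a closed form $\tau^* = (I + \lambda D^\top D)^{-1} y$, where $D$ is the $(T-2) \times T$ second-difference matrix. A minimal sketch (the function name `smooth` and the simulated data are my own, not from the question) comparing $\tau^*$ for growing $\lambda$ against the OLS linear trend:

```python
import numpy as np

def smooth(y, lam):
    """Minimize sum (y_t - tau_t)^2 + lam * sum (tau_{t+1} - 2 tau_t + tau_{t-1})^2.

    First-order condition: (I + lam * D'D) tau = y, where D is the
    (T-2) x T second-difference matrix.
    """
    T = len(y)
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(T) + lam * D.T @ D, y)

# Noisy series around a linear trend (illustrative data, not from the question)
rng = np.random.default_rng(0)
t = np.arange(50)
y = 2.0 + 0.5 * t + rng.normal(size=50)

# OLS fit of y on a constant and a time trend, for comparison
b, a = np.polyfit(t, y, 1)
linear = a + b * t

for lam in [1e2, 1e6, 1e10]:
    print(lam, np.max(np.abs(smooth(y, lam) - linear)))
```

The printed gap shrinks toward zero as $\lambda$ grows: the null space of $D$ is exactly the set of sequences linear in $t$, and in the limit $\tau^*$ is the orthogonal projection of $y$ onto that space, which coincides with the OLS linear trend.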