Multivariate time series and machine learning.


I have a question about applying machine learning algorithms to time series data. Time series data is affected by "order" or "sequence" (that is, the time index matters: for example, current sales are influenced by previous sales), so how can machine learning handle this order/sequence? I have explored the method of transforming a time series into a supervised learning problem. But how do we do this when we have many features (e.g., forecasting sales with features such as past sales, promotions, holidays, ...)?


1 Answer


What one usually does is include lags of the variables as regressors. Suppose I want to predict $y_t$ and I have additional variables $x_t$. Then we could regress $y_t$ on the variables $x_t, x_{t-1}, \ldots, x_{t-p}, y_{t-1}, \ldots, y_{t-p}$, for some appropriately chosen $p$. In addition, we can also include time itself as a variable.
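A minimal sketch of this transformation in pandas, using made-up sales and promotion numbers and $p=2$ lags (the column names and data are purely illustrative):

```python
import pandas as pd

# Toy example: sales with a promotion indicator (made-up numbers).
df = pd.DataFrame({
    "sales":     [10, 12, 13, 15, 14, 16, 18, 17],
    "promotion": [ 0,  1,  0,  0,  1,  0,  1,  0],
})

p = 2  # number of lags to include (chosen for illustration)

# Add lagged copies of every variable: y_{t-1},...,y_{t-p}, x_{t-1},...,x_{t-p}.
for col in ["sales", "promotion"]:
    for lag in range(1, p + 1):
        df[f"{col}_lag{lag}"] = df[col].shift(lag)

# Optionally include time itself as a regressor.
df["t"] = range(len(df))

# The first p rows have undefined lags; drop them before fitting any model.
supervised = df.dropna().reset_index(drop=True)

X = supervised.drop(columns=["sales"])  # regressors: x_t, lags, and t
y = supervised["sales"]                 # target y_t
```

The resulting `X` and `y` can be fed to any standard regression model (linear regression, random forests, gradient boosting, etc.), since the temporal dependence is now encoded in the lag columns.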