# Meta-Learning for Time Series Forecasting (DeepTime) in PyTorch Lightning

This article describes a new type of deep learning model that copes with the usual problems in time series (covariate shift and conditional distribution shift, both of which arise from non-stationarity) by using a meta-learning formulation to forecast the future. The model is named DeepTime: a deep time-index model combined with meta-learning. It is a great example of the synergy between time-index models and a meta-learning formulation for time series forecasting.

# DeepTime in one Figure

In general, you can see that there are three types of layers in DeepTime:

1. Ridge Regressor
2. Multi-layer Perceptron (MLP)
3. Random Fourier Features

Let’s see what these layers are doing:

## 2. MLP (multi-layer perceptron)

Well, nothing new!! These are the linear layers that we use in simple (and, of course, powerful) neural networks (NNs). After each layer, we apply a ReLU activation. The point is that these layers are an excellent fit for mapping a time index to the value of the time series at that time index. The formula is described below:
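The formula image did not survive extraction; a standard K-layer ReLU MLP over a time index τ, in assumed notation (symbols are mine, not taken from the paper), can be sketched as:

```latex
\mathbf{z}^{(0)} = \tau, \qquad
\mathbf{z}^{(k)} = \mathrm{ReLU}\!\left(\mathbf{W}^{(k)}\mathbf{z}^{(k-1)} + \mathbf{b}^{(k)}\right),
\quad k = 1, \dots, K-1, \qquad
\hat{y}_\tau = \mathbf{W}^{(K)}\mathbf{z}^{(K-1)} + \mathbf{b}^{(K)}
```

That is, the network consumes the time index itself (possibly after a feature map such as the Fourier features described next) rather than past values of the series.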

## 3. Random Fourier

Random Fourier features allow MLPs to learn high-frequency patterns. However, a plain random Fourier features layer requires tuning a scale hyperparameter for each task and dataset (just to avoid over- or under-fitting); the contributors side-step this tuning by concatenating Fourier basis functions with several different scale parameters.
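As a rough illustration of that trick, here is a minimal NumPy sketch of concatenated random Fourier features; the scale values, dimensions, and function name are assumptions for illustration, not taken from the official implementation:

```python
import numpy as np

def concatenated_fourier_features(tau, scales=(0.01, 0.1, 1.0, 10.0),
                                  dim_per_scale=8, seed=0):
    """Map scalar time indices to random Fourier features.

    For each scale s, draw random frequencies w ~ N(0, s^2) and emit
    [sin(2*pi*w*tau), cos(2*pi*w*tau)], then concatenate across scales,
    so no single scale hyperparameter has to be tuned per dataset.
    """
    rng = np.random.default_rng(seed)
    tau = np.asarray(tau, dtype=float).reshape(-1, 1)    # (N, 1)
    feats = []
    for s in scales:
        w = rng.normal(0.0, s, size=(1, dim_per_scale))  # random frequencies
        angles = 2.0 * np.pi * tau @ w                   # (N, dim_per_scale)
        feats.append(np.sin(angles))
        feats.append(np.cos(angles))
    # (N, 2 * dim_per_scale * len(scales))
    return np.concatenate(feats, axis=1)

phi = concatenated_fourier_features(np.linspace(0.0, 1.0, 5))
print(phi.shape)  # (5, 64)
```

Small scales capture slow trends while large scales capture high-frequency wiggles; concatenating them lets the downstream MLP pick whichever it needs.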

# DeepTime in one picture

As you can easily see, in each task we pick a time series sequence and then divide it into two parts, known as the lookback window (green) and the forecast window (blue). These then go through the two meta-models associated with the meta parameters, which share information with each other. After running the model through the architecture described in PICTURE 2, we calculate the loss function and try to minimize it.
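The ridge regressor from the layer list is the part that makes this per-task adaptation cheap: it has a closed-form solution on the first window's features, so adapting to each task is a single linear solve. A minimal NumPy sketch, with illustrative shapes and regularization strength (not taken from the paper):

```python
import numpy as np

def ridge_closed_form(Z, Y, lam=1.0):
    """Closed-form ridge regression: W = (Z^T Z + lam*I)^-1 Z^T Y.

    In DeepTime's inner loop, Z would be the MLP features of the past
    window's time indices and Y the observed values; the closed form
    keeps the adaptation step fast and differentiable.
    """
    d = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ Y)

# Toy usage: fit on features of the past window, predict the forecast window.
rng = np.random.default_rng(0)
Z_past = rng.normal(size=(24, 16))       # features of 24 past time steps
Y_past = rng.normal(size=(24, 1))        # observed values at those steps
W = ridge_closed_form(Z_past, Y_past)
Z_future = rng.normal(size=(12, 16))     # features of 12 future time steps
Y_hat = Z_future @ W
print(Y_hat.shape)  # (12, 1)
```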

That’s all. 🙂

# What makes DeepTime different from other time series forecasting models?

Well, the most obvious one is that DeepTime is a time-index model, just like Prophet, Gaussian processes, etc.; however, recent outstanding models like N-HiTS, Autoformer, DeepAR, and Informer are historical-value models. Ahhhh… what does that mean?? Keep up!

When we say a model is a time-index model for time series, we mean exactly that the forecast is a function of time itself (it consumes features of the current time index). On the other hand, historical-value models use previous observations to forecast the future. Did you get it?? If not, maybe the formulation can make it clearer for you. 🙂
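The formulation image did not survive extraction; in standard notation (symbols assumed, not copied from the paper), the contrast looks like this, with L the lookback length and H the forecast horizon:

```latex
\text{time-index model:} \quad \hat{y}_\tau = f_\theta(\tau)
\qquad\qquad
\text{historical-value model:} \quad \hat{y}_{t+1:t+H} = f_\theta\!\left(y_{t-L+1:t}\right)
```

The first maps a time index directly to a value; the second maps a window of past observations to future values.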

It is equipped with a meta-learning formulation, which means learning to learn (you can read my previous article about its fundamentals).

As it is a time-index model, it shows better sample efficiency in meta-learning.

It takes the direct multi-step (DMS) approach. Whattt..!! OK: DMS models directly forecast several data points at once (the whole horizon, a.k.a. the forecast window). On the other hand, we have IMS (iterative multi-step) models, which predict only the next value and then feed it back in to forecast the following data point, and so on; the most popular representatives are ARIMA, DeepAR, etc.
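The difference is easy to see in code. A hedged sketch of the two styles, with toy persistence models standing in for real forecasters (all names here are illustrative):

```python
def ims_forecast(history, one_step_model, horizon):
    """Iterative multi-step: predict one value, append it, repeat.

    Note that later predictions consume earlier predictions, so
    errors can compound over the horizon.
    """
    history = list(history)
    preds = []
    for _ in range(horizon):
        nxt = one_step_model(history)
        preds.append(nxt)
        history.append(nxt)  # the model's own output becomes its input
    return preds

def dms_forecast(history, multi_step_model, horizon):
    """Direct multi-step: one call produces the whole horizon at once."""
    return multi_step_model(history, horizon)

# Toy stand-in models: persistence (repeat the last observed value).
one_step = lambda h: h[-1]
multi_step = lambda h, H: [h[-1]] * H

print(ims_forecast([1, 2, 3], one_step, 4))    # [3, 3, 3, 3]
print(dms_forecast([1, 2, 3], multi_step, 4))  # [3, 3, 3, 3]
```

For the trivial persistence model the two agree; for learned models they generally do not, and DMS avoids the error accumulation of the IMS feedback loop.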

# What does meta-learning bring to time series forecasting?

• The capacity to conform to the assumption that nearby time steps follow a locally stationary distribution.
• A stronger assumption that similar time points have similar characteristics.

# How does the model forecast??

Well, as you read previously, in each episode we divide the data into two windows and forecast the second one using the first.
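Slicing a series into such episode pairs can be sketched as follows; the window lengths and the helper's name are assumptions for illustration:

```python
import numpy as np

def make_episodes(series, lookback=24, horizon=12, stride=1):
    """Slice a series into (past window, forecast window) pairs.

    Each pair is one 'task': the model adapts on the first window
    and is evaluated on the forecast window that follows it.
    """
    episodes = []
    last = len(series) - lookback - horizon
    for start in range(0, last + 1, stride):
        x = series[start : start + lookback]
        y = series[start + lookback : start + lookback + horizon]
        episodes.append((x, y))
    return episodes

series = np.arange(100.0)
eps = make_episodes(series, lookback=24, horizon=12)
print(len(eps))                      # 65
print(eps[0][0][-1], eps[0][1][0])   # 23.0 24.0
```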

# DeepTime Trainer in PyTorch Lightning

It’s not that big of a deal. You can simply follow the official docs on the Trainer and deploy your models with PyTorch Lightning. There are only a few simple steps to transform your PyTorch model into a PyTorch Lightning one. I’m not going to explain what PyTorch Lightning is, just to keep this from getting longer and longer, so let’s get straight to the point. (😉)

We can add more modifications to the trainer, e.g., which metrics we want logged.

You can have a look at the GitHub repository to run and use DeepTime in PyTorch Lightning yourself (in case you want to look at the main code of the model, you can visit here). If there are any questions, I’d be glad to answer them.
