W-Transformers : A Wavelet-based Transformer Framework for Univariate Time Series Forecasting

Document record

Date: 8 September 2022

Collection: arXiv

Organization: Cornell University



Related subjects (English): Hours (Time)

Cite this document

Lena Sasal et al., "W-Transformers: A Wavelet-based Transformer Framework for Univariate Time Series Forecasting", arXiv - economics, ID: 10.1109/ICMLA55696.2022.00111





Abstract

Deep learning utilizing transformers has recently achieved great success in many vital areas such as natural language processing, computer vision, anomaly detection, and recommendation systems, among many others. Among the several merits of transformers, the ability to capture long-range temporal dependencies and interactions is desirable for time series forecasting, leading to its progress in various time series applications. In this paper, we build a transformer model for non-stationary time series. The problem is challenging yet crucially important. We present a novel framework for univariate time series representation learning based on a wavelet-based transformer encoder architecture, which we call W-Transformer. The proposed W-Transformer applies a maximal overlap discrete wavelet transform (MODWT) to the time series data and builds local transformers on the decomposed series to capture the nonstationarity and long-range nonlinear dependencies in the time series. Evaluating our framework on several publicly available benchmark time series datasets from various domains and with diverse characteristics, we demonstrate that it performs, on average, significantly better than the baseline forecasters for short-term and long-term forecasting, even for datasets that consist of only a few hundred training samples.
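The MODWT decomposition at the heart of the framework is undecimated and additive: each decomposition level has the same length as the input, and for the level-1 Haar filter the detail and smooth coefficients sum exactly back to the original series, which is what lets per-component forecasts be recombined. A minimal NumPy-only sketch of a level-1 Haar MODWT with circular boundary handling (an illustration of the transform, not the authors' code; `modwt_haar_level1` is a hypothetical helper name):

```python
import numpy as np

def modwt_haar_level1(x):
    """Level-1 MODWT with the Haar filter (DWT filters rescaled by 1/sqrt(2)).

    Uses circular (periodic) boundary handling, as is standard for the MODWT.
    Returns (detail, smooth), each the same length as x: the MODWT is
    undecimated, unlike the ordinary DWT.
    """
    x = np.asarray(x, dtype=float)
    x_prev = np.roll(x, 1)            # x_{t-1 mod N}, circular shift
    detail = (x - x_prev) / 2.0       # wavelet coefficients W_{1,t}
    smooth = (x + x_prev) / 2.0       # scaling coefficients V_{1,t}
    return detail, smooth

# A W-Transformer-style pipeline would fit one local forecaster per
# decomposed component and sum the component forecasts. Here we only
# verify the additive property: detail + smooth reconstructs x exactly.
x = np.sin(np.linspace(0.0, 8.0 * np.pi, 200)) + np.linspace(0.0, 3.0, 200)
detail, smooth = modwt_haar_level1(x)
assert np.allclose(detail + smooth, x)
```

Deeper decompositions repeat this filtering on the smooth component with upsampled filters; the paper's experiments use the MODWT precisely because this shift-invariant, length-preserving structure aligns each coefficient series with the original time axis.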

