Transformers in Time-Series Analysis: A Tutorial

Sabeen Ahmed, Ian E. Nielsen, Aakash Tripathi, Shamoon Siddiqui, Ravi P. Ramachandran, Ghulam Rasool

Research output: Contribution to journal › Article › peer-review

47 Scopus citations

Abstract

Transformer architectures have widespread applications, particularly in Natural Language Processing and Computer Vision. Recently, Transformers have been employed in various aspects of time-series analysis. This tutorial provides an overview of the Transformer architecture, its applications, and a collection of examples from recent research in time-series analysis. We delve into an explanation of the core components of the Transformer, including the self-attention mechanism, positional encoding, multi-head attention, and the encoder/decoder structure. Several enhancements to the initial Transformer architecture are highlighted to tackle time-series tasks. The tutorial also provides best practices and techniques to overcome the challenge of effectively training Transformers for time-series analysis.
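To make the abstract's central component concrete, the snippet below gives a minimal sketch of single-head scaled dot-product self-attention on a toy time-series input. It is an illustration added here, not code from the paper; the dimensions, matrices, and function names are assumptions chosen only for demonstration.

    # Illustrative sketch (not from the paper): single-head scaled dot-product
    # self-attention over a toy time-series input, using NumPy.
    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        """X: (seq_len, d_model) sequence; Wq/Wk/Wv: (d_model, d_k) projections."""
        Q = X @ Wq                       # queries
        K = X @ Wk                       # keys
        V = X @ Wv                       # values
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)  # similarity between every pair of time steps
        # softmax over keys gives attention weights for each query position
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V               # attention-weighted sum of values

    rng = np.random.default_rng(0)
    seq_len, d_model, d_k = 8, 16, 4     # toy dimensions (assumed)
    X = rng.normal(size=(seq_len, d_model))
    Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
    print(self_attention(X, Wq, Wk, Wv).shape)  # (8, 4)

Multi-head attention repeats this computation with several independent projection matrices and concatenates the results, while positional encoding adds order information to the input sequence before the projections.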

Original language: English (US)
Pages (from-to): 7433-7466
Number of pages: 34
Journal: Circuits, Systems, and Signal Processing
Volume: 42
Issue number: 12
DOIs
State: Published - Dec 2023

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Applied Mathematics
