Transformers for Limit Order Books

Posted: 25 Mar 2020

Date Written: February 28, 2020


We introduce a new deep learning architecture for predicting price movements from limit order books. The architecture combines a causal convolutional network for feature extraction with masked self-attention, which updates the extracted features using relevant contextual information. It is shown to significantly outperform existing architectures based on convolutional neural networks (CNNs) and Long Short-Term Memory (LSTM) networks, establishing a new state-of-the-art benchmark on the FI-2010 dataset.
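The abstract's two components can be sketched in PyTorch. This is a minimal illustrative sketch, not the paper's implementation: the layer sizes, kernel widths, and head count are assumptions, and the 40 input features follow the usual FI-2010 convention (10 order-book levels, bid/ask price and volume). The causal convolution left-pads so each time step only sees the past, and a triangular attention mask enforces the same constraint in self-attention.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """Causal 1-D convolution: output at time t depends only on inputs <= t."""
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        self.pad = kernel_size - 1
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size)

    def forward(self, x):                      # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))  # left-pad only

class ConvAttentionModel(nn.Module):
    """Hypothetical sketch: causal conv feature extraction + masked self-attention."""
    def __init__(self, n_features=40, d_model=64, n_heads=4, n_classes=3):
        super().__init__()
        self.extract = nn.Sequential(
            CausalConv1d(n_features, d_model, kernel_size=4), nn.ReLU(),
            CausalConv1d(d_model, d_model, kernel_size=4), nn.ReLU(),
        )
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, n_classes)  # up / stationary / down

    def forward(self, x):                      # x: (batch, time, n_features)
        h = self.extract(x.transpose(1, 2)).transpose(1, 2)
        T = h.size(1)
        # Mask out future positions: True entries are disallowed attention links.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h, _ = self.attn(h, h, h, attn_mask=mask)
        return self.head(h[:, -1])             # classify from the last time step

model = ConvAttentionModel()
logits = model(torch.randn(2, 100, 40))        # 2 sequences of 100 book snapshots
print(logits.shape)                            # torch.Size([2, 3])
```

The three output classes correspond to the usual FI-2010 labelling of mid-price movement (up, stationary, down); all other hyperparameters above are placeholders.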

Keywords: time series, neural networks, deep learning, attention, transformer

JEL Classification: C45, C15, C50, C53, C6, C63, G00, G10

Suggested Citation

Wallbridge, James, Transformers for Limit Order Books (February 28, 2020). Available at SSRN:
