Forecasting open-high-low-close (OHLC) data with Transformer models

niksyromyatnikov/OHLCFormer


A neural network training and evaluation tool for open-high-low-close (OHLC) data forecasting.

OHLCFormer provides an easy-to-use API for model prototyping, training, and evaluation to perform open-high-low-close (OHLC) data forecasting tasks.
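To make the forecasting task concrete, here is a minimal sketch of how OHLC bars are typically turned into model inputs: fixed-length windows of past bars paired with the next bar as the target. This is an illustration of the task itself, not OHLCFormer's actual API; the function name and shapes are hypothetical.

```python
import numpy as np

def make_windows(ohlc, window=8):
    """Split an (N, 4) OHLC array into (inputs, next-bar targets).

    Hypothetical helper for illustration: each input is `window`
    consecutive bars, and the target is the bar that follows them.
    """
    inputs, targets = [], []
    for i in range(len(ohlc) - window):
        inputs.append(ohlc[i : i + window])   # past `window` bars
        targets.append(ohlc[i + window])      # the bar to forecast
    return np.stack(inputs), np.stack(targets)

# 100 synthetic bars with columns (open, high, low, close)
rng = np.random.default_rng(0)
close = np.cumsum(rng.normal(size=100)) + 100.0
ohlc = np.stack([close, close + 0.5, close - 0.5, close], axis=1)

X, y = make_windows(ohlc, window=8)
print(X.shape, y.shape)  # (92, 8, 4) (92, 4)
```

A model then consumes `X` as a batch of sequences and is trained to predict `y`; the notebooks below cover how OHLCFormer itself builds such datasets.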

Getting started

Here is a list of the official notebooks:

| Notebook | Description | |
| --- | --- | --- |
| Data processing | How to preprocess your data and build a dataset. | Open in Colab |
| Model training | How to set up and train a model on your dataset. | Open in Colab |
| Models evaluation | How to benchmark models with OHLCFormer. | Open in Colab |
| Logging | How to set up and use the logging components. | Open in Colab |

Model architectures

OHLCFormer currently provides the following architectures:

  1. FNet (from Google Research) released with the paper FNet: Mixing Tokens with Fourier Transforms by James Lee-Thorp, Joshua Ainslie, Ilya Eckstein, Santiago Ontanon.
  2. BERT (from Google) released with the paper BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding by Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova.
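The key idea behind FNet, as described in the cited paper, is to replace the self-attention sublayer with an unparameterized 2D Fourier transform over the sequence and hidden dimensions, keeping only the real part. A minimal numpy sketch of that mixing sublayer (shapes are illustrative, not OHLCFormer's internals):

```python
import numpy as np

def fourier_mixing(x):
    """FNet-style token mixing: real part of a 2D DFT over (seq, hidden).

    This replaces self-attention in FNet; it has no learned parameters
    and mixes information across all positions in one pass.
    """
    return np.fft.fft2(x, axes=(-2, -1)).real

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 4))   # e.g. 16 OHLC bars embedded into 4 features
mixed = fourier_mixing(x)
print(mixed.shape)  # (16, 4) — mixing preserves the input shape
```

Because the DFT is fixed rather than learned, this sublayer trades some modeling capacity for a substantial speedup over attention, which is the trade-off the FNet paper explores.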