This repository contains a modified implementation of the Informer model, used to experiment with long-sequence time-series forecasting of financial factor data.
- Apply Informer model to financial time-series data (Fama-French factors)
- Benchmark performance against persistence and zero baselines
- Identify which factors exhibit the most predictable structure
- Evaluate multi-step forecasting across a variety of architectural modifications and hyperparameter settings
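The persistence and zero baselines can be sketched as follows. This is a minimal NumPy illustration on synthetic data, not the repository's actual evaluation code: the persistence baseline repeats the last observed value over the forecast horizon, while the zero baseline predicts zero (a natural reference for roughly mean-zero factor returns).

```python
import numpy as np

rng = np.random.default_rng(0)
series = rng.normal(0.0, 1.0, 500)  # stand-in for a daily factor return series
pred_len = 10

# Each forecast target is the next pred_len values after an observation.
targets = np.lib.stride_tricks.sliding_window_view(series[1:], pred_len)
last_obs = series[: len(targets)]  # value observed just before each window

# Persistence: repeat the last observed value across the horizon.
persistence_mse = np.mean((targets - last_obs[:, None]) ** 2)
# Zero: always predict 0.
zero_mse = np.mean(targets ** 2)

print(f"persistence MSE: {persistence_mse:.4f}")
print(f"zero MSE:        {zero_mse:.4f}")
```

A model forecast is only interesting if its MSE beats both of these; on uncorrelated noise like the synthetic series above, the zero baseline wins and persistence roughly doubles the error.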
```bash
git clone https://github.com/mmocklin18/finformer.git
cd finformer
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

Example training run:

```bash
python -u main_informer.py \
  --model informer \
  --data custom \
  --root_path ./data/ \
  --data_path factors_data.csv \
  --features S \
  --target UMD \
  --freq d \
  --seq_len 60 \
  --label_len 30 \
  --pred_len 10 \
  --enc_in 1 \
  --dec_in 1 \
  --c_out 1 \
  --e_layers 2 \
  --d_layers 1 \
  --n_heads 4 \
  --d_model 128 \
  --d_ff 256 \
  --dropout 0.1 \
  --train_epochs 10 \
  --batch_size 16 \
  --learning_rate 0.001 \
  --loss mse \
  --gpu 2 \
  --itr 5
```

For a complete list of command-line arguments and configuration options, refer to the original Informer repository.
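Assuming the data loader behaves like the original Informer's `Dataset_Custom` (a leading `date` column, with the remaining columns treated as series; under `--features S --target UMD` only the `UMD` column is used), a `factors_data.csv` in the expected layout could be produced like this. The factor values here are synthetic stand-ins; in practice they would come from Ken French's data library:

```python
import numpy as np
import pandas as pd

# Hypothetical raw factor data; in practice, fetch the daily Fama-French
# factors (Mkt-RF, SMB, HML, UMD, ...) from Ken French's data library.
dates = pd.bdate_range("2015-01-01", periods=250)
rng = np.random.default_rng(1)
raw = pd.DataFrame(
    {
        "Mkt-RF": rng.normal(0.03, 1.0, len(dates)),
        "SMB": rng.normal(0.0, 0.6, len(dates)),
        "HML": rng.normal(0.0, 0.6, len(dates)),
        "UMD": rng.normal(0.0, 0.8, len(dates)),
    },
    index=dates,
)

# Move the index into a 'date' column, as the custom loader expects.
df = raw.reset_index().rename(columns={"index": "date"})
df.to_csv("factors_data.csv", index=False)
print(df.head())
```

The resulting file is then referenced via `--root_path` and `--data_path` on the command line above.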
```bibtex
@article{haoyietal-informerEx-2023,
  author  = {Haoyi Zhou and Jianxin Li and Shanghang Zhang and Shuai Zhang and Mengyi Yan and Hui Xiong},
  title   = {Expanding the prediction capacity in long sequence time-series forecasting},
  journal = {Artificial Intelligence},
  volume  = {318},
  pages   = {103886},
  issn    = {0004-3702},
  year    = {2023},
}

@inproceedings{haoyietal-informer-2021,
  author    = {Haoyi Zhou and Shanghang Zhang and Jieqi Peng and Shuai Zhang and Jianxin Li and Hui Xiong and Wancai Zhang},
  title     = {Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting},
  booktitle = {The Thirty-Fifth {AAAI} Conference on Artificial Intelligence, {AAAI} 2021, Virtual Conference},
  volume    = {35},
  number    = {12},
  pages     = {11106--11115},
  publisher = {{AAAI} Press},
  year      = {2021},
}
```
This project builds on the original Informer implementation by Zhou et al. (AAAI 2021), used under the MIT License.