HTML-Hierarchical-Transformer-based-Multi-task-Learning-for-Volatility-Prediction
If you find this repository helpful for your research, please cite the following paper:
Linyi Yang, Tin Lok James Ng, Barry Smyth, Ruihai Dong. HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction. Proceedings of the The Web Conference 2020.
@inproceedings{yang2020html,
title={HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction},
author={Yang, Linyi and Ng, Tin Lok James and Smyth, Barry and Dong, Ruihai},
booktitle={Proceedings of The Web Conference 2020},
pages={441--451},
year={2020}
}
Dataset
The token-level transformer relies on pre-trained transformers, which can be downloaded from here.
The raw earnings-call dataset can be found in [Qin and Yang, ACL-19].
Model
We provide the code and data used for the paper. Our HTML model consists of a token-level transformer and a sentence-level transformer, both of which can be found under the Model path. We also provide our experimental code for the multi-task and single-task settings respectively.
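For illustration, here is a minimal NumPy sketch of the hierarchical encoding idea: tokens are attended and pooled within each sentence, sentence vectors are attended and pooled into a call-level vector, and one regression head per volatility horizon makes the multi-task predictions. All weights, dimensions, and horizon choices below are hypothetical placeholders; the actual model uses learned multi-head transformer layers (see the code in this repository).

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # single-head scaled dot-product self-attention over rows of X (n, d);
    # projection weights are omitted here for brevity
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    return softmax(scores) @ X

def hierarchical_encode(call):
    # call: list of sentences, each an array of token embeddings (n_tokens, d)
    # token-level encoder + mean pooling -> one vector per sentence
    sent_vecs = np.stack([self_attention(s).mean(axis=0) for s in call])
    # sentence-level encoder + mean pooling -> one vector per call
    return self_attention(sent_vecs).mean(axis=0)

rng = np.random.default_rng(0)
call = [rng.normal(size=(5, 16)) for _ in range(3)]  # 3 sentences, 5 tokens, dim 16
doc = hierarchical_encode(call)

# multi-task heads: one linear regressor per volatility horizon (horizons hypothetical)
heads = {h: rng.normal(size=16) for h in (3, 7, 15, 30)}
preds = {h: float(w @ doc) for h, w in heads.items()}
```

In the paper's setting the horizon-specific heads share the hierarchical encoder, which is what makes the prediction multi-task rather than one model per horizon.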
Contact
For any questions or queries, feel free to email me at [email protected]. Thanks for reading.