# Transformers for Mixed-type Event Sequences

Felix Draxler, Yang Meng, Kai Nelson, Lukas Laskowski, Yibo Yang, Theofanis Karaletsos, Stephan Mandt

This repository contains an unofficial implementation of the Transformer model for mixed-type event sequences described in the paper:
```bibtex
@inproceedings{
draxler2025transformers,
title={Transformers for Mixed-type Event Sequences},
author={Felix Draxler and Yang Meng and Kai Nelson and Lukas Laskowski and Yibo Yang and Theofanis Karaletsos and Stephan Mandt},
booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
year={2025},
url={https://openreview.net/forum?id=MtwsRjPZhf}
}
```

## Installation

Clone the repository:
```shell
git clone https://github.com/fdraxler/FlexTPP.git
cd FlexTPP
```

We use uv to manage the environment and handle dependencies. To set up the environment, run:
```shell
uv venv
uv sync
```

Alternatively, you can install the project manually using pip. Make sure to activate your virtual environment first, then install our repository as a package:
```shell
pip install .
```

Be sure to activate the environment before running any code:

```shell
source .venv/bin/activate
```

## Running experiments

We provide the config files for all experiments in the paper in the `configs/` folder. To run an experiment, use the following command:
```shell
python run.py --config-name <name-of-config-file-in-configs>
```

For example, to run the experiment on the EasyTPP Amazon dataset, use:
```shell
python run.py --config-name easytpp_amazon
```

If you are not logged into Weights & Biases, you can pass `trainer.logger.0.offline=True` on the command line to log runs locally, or follow the login instructions.
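Putting the two together, a run that logs locally instead of to the Weights & Biases server might look like the following (this sketch assumes the override syntax above combines with `--config-name` in a single invocation):

```shell
# Run the EasyTPP Amazon experiment with W&B logging kept offline
# (the trainer.logger.0.offline=True override is described in the text above)
python run.py --config-name easytpp_amazon trainer.logger.0.offline=True
```

Offline runs can later be uploaded with `wandb sync` if you log in afterwards.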