Transformers for Mixed-Type Event Sequences

Felix Draxler, Yang Meng, Kai Nelson, Lukas Laskowski, Yibo Yang, Theofanis Karaletsos, Stephan Mandt

This repository contains the official implementation of the Transformer model for mixed-type event sequences, as described in the paper:

@inproceedings{
    draxler2025transformers,
    title={Transformers for Mixed-type Event Sequences},
    author={Felix Draxler and Yang Meng and Kai Nelson and Lukas Laskowski and Yibo Yang and Theofanis Karaletsos and Stephan Mandt},
    booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
    year={2025},
    url={https://openreview.net/forum?id=MtwsRjPZhf}
}

Installation

Clone the repository:

git clone https://github.com/fdraxler/FlexTPP.git
cd FlexTPP

We use uv to manage the environment and handle dependencies. To set up the environment, run:

uv venv
uv sync

Alternatively, you can install the project manually with pip. Make sure to activate your virtual environment first, then install the repository as a package:

pip install .

Be sure to activate the environment before running any code:

source .venv/bin/activate

Run experiments

We provide the config files for all experiments in the paper in the configs/ folder. To run an experiment, use the following command:

python run.py --config-name <name-of-config-file-in-configs>

For example, to run the experiment on the EasyTPP Amazon dataset, use:

python run.py --config-name easytpp_amazon

If you are not logged into Weights & Biases, you can pass trainer.logger.0.offline=True on the command line to log runs locally, or follow the instructions to log in. For example:

python run.py --config-name easytpp_amazon trainer.logger.0.offline=True
