The AutoGluon developers and community are committed to open source, and that extends to our research. Below you can find a list of our published work relating to AutoGluon along with guidelines for citing our work.
- AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data (arXiv, 2020) (BibTeX)
- Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation (NeurIPS, 2020) (BibTeX)
- Benchmarking Multimodal AutoML for Tabular Data with Text Fields (NeurIPS, 2021) (BibTeX)
- XTab: Cross-table Pretraining for Tabular Transformers (ICML, 2023)
- AutoGluon-TimeSeries: AutoML for Probabilistic Time Series Forecasting (AutoML Conf, 2023) (BibTeX)
- TabRepo: A Large Scale Repository of Tabular Model Evaluations and its AutoML Applications (AutoML Conf, 2024)
- AutoGluon-Multimodal (AutoMM): Supercharging Multimodal AutoML with Foundation Models (AutoML Conf, 2024) (BibTeX)
Please cite the core AutoGluon paper (AutoGluon-Tabular) as well as any module-specific paper that is relevant.
If you use AutoGluon in a scientific publication, please cite the following paper (regardless of which module is being used):
- Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).
BibTeX entry:
@article{agtabular,
  title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
  author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2003.06505},
  year={2020}
}
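For example, a minimal LaTeX sketch of citing the core paper together with a module-specific paper, assuming the BibTeX entries on this page are saved to a local `autogluon.bib` file (the file name is only illustrative):

```latex
\documentclass{article}
\begin{document}
% Cite the core AutoGluon paper plus the module-specific paper you rely on,
% using the citation keys defined in the BibTeX entries on this page.
We build on AutoGluon~\cite{agtabular} and its time series forecasting
module~\cite{agtimeseries}.
\bibliographystyle{plain}
\bibliography{autogluon} % assumes the entries are stored in autogluon.bib
\end{document}
```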
If you use AutoGluon's multimodal functionality in a scientific publication, please cite the following paper:
Tang, Zhiqiang, et al. "AutoGluon-Multimodal (AutoMM): Supercharging Multimodal AutoML with Foundation Models." International Conference on Automated Machine Learning (AutoML Conf), 2024.
BibTeX entry:
@article{tang2024autogluon,
  title={AutoGluon-Multimodal (AutoMM): Supercharging Multimodal AutoML with Foundation Models},
  author={Tang, Zhiqiang and Fang, Haoyang and Zhou, Su and Yang, Taojiannan and Zhong, Zihan and Hu, Tony and Kirchhoff, Katrin and Karypis, George},
  journal={arXiv preprint arXiv:2404.16233},
  year={2024}
}
If you use AutoGluon's TextPredictor, please cite the following paper:
Shi, Xingjian, et al. "Benchmarking Multimodal AutoML for Tabular Data with Text Fields." Advances in Neural Information Processing Systems 35, Datasets and Benchmarks Track (2021).
BibTeX entry:
@inproceedings{agmultimodaltext,
  title={Benchmarking Multimodal AutoML for Tabular Data with Text Fields},
  author={Shi, Xingjian and Mueller, Jonas and Erickson, Nick and Li, Mu and Smola, Alexander J},
  booktitle={Advances in Neural Information Processing Systems Datasets and Benchmarks Track},
  volume={35},
  year={2021}
}
If you use AutoGluon's time series forecasting functionality in a scientific publication, please cite the following paper:
@inproceedings{agtimeseries,
  title={{AutoGluon-TimeSeries}: {AutoML} for Probabilistic Time Series Forecasting},
  author={Shchur, Oleksandr and Turkmen, Caner and Erickson, Nick and Shen, Huibin and Shirkov, Alexander and Hu, Tony and Wang, Yuyang},
  booktitle={International Conference on Automated Machine Learning},
  year={2023}
}
If you use the Chronos pretrained model, please cite:
@article{ansari2024chronos,
  author = {Ansari, Abdul Fatir and Stella, Lorenzo and Turkmen, Caner and Zhang, Xiyuan and Mercado, Pedro and Shen, Huibin and Shchur, Oleksandr and Rangapuram, Syama Sundar and Pineda Arango, Sebastian and Kapoor, Shubham and Zschiegner, Jasper and Maddix, Danielle C. and Mahoney, Michael W. and Torkkola, Kari and Gordon Wilson, Andrew and Bohlke-Schneider, Michael and Wang, Yuyang},
  title = {Chronos: Learning the Language of Time Series},
  journal = {arXiv preprint arXiv:2403.07815},
  year = {2024}
}
If you use AutoGluon-Tabular's model distillation functionality, please cite the following paper:
Fakoor, Rasool, et al. "Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation." Advances in Neural Information Processing Systems 33 (2020).
BibTeX entry:
@article{agtabulardistill,
  title={Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation},
  author={Fakoor, Rasool and Mueller, Jonas W and Erickson, Nick and Chaudhari, Pratik and Smola, Alexander J},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}