Transformers: State-of-the-art natural language processing T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020 | 14885* | 2020 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter V Sanh, L Debut, J Chaumond, T Wolf arXiv preprint arXiv:1910.01108, 2019 | 7513 | 2019 |
Datasets: A community library for natural language processing Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ... arXiv preprint arXiv:2109.02846, 2021 | 238 | 2021 |
PEFT: State-of-the-art parameter-efficient fine-tuning methods S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan URL: https://github.com/huggingface/peft, 2022 | 205 | 2022 |
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019) V Sanh, L Debut, J Chaumond, T Wolf URL: http://arxiv.org/abs/1910.01108, 2019 | 110 | 2019 |
HuggingFace’s Transformers: State-of-the-art natural language processing T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... arXiv preprint arXiv:1910.03771, 2019 | 76 | 2019 |
P von Platen, T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, ... C Ma, Y Jernite, J Plu, C Xu, TL Scao, S Gugger …, 2020 | 41 | 2020 |
HuggingFace’s Transformers: State-of-the-art natural language processing. CoRR abs/1910.03771 (2019) T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ... URL: http://arxiv.org/abs/1910.03771, 2019 | 39 | 2019 |
Accelerate: Training and inference at scale made simple, efficient and adaptable S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar | 5 | 2022 |
Datasets T Wolf, Q Lhoest, P von Platen, Y Jernite, M Drame, J Plu, J Chaumond, ... GitHub. Note: https://github.com/huggingface/datasets, 2020 | 5 | 2020 |
Benchmarking Transformers: PyTorch and TensorFlow L Debut | 2 | 2019 |