Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. \textit{Transformers} is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing the library is a curated collection of pretrained models made by and available for the community. \textit{Transformers} is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at \url{this https URL}.
Sam Shleifer, Victor Sanh, Julien Chaumond, Clement Delangue, Lysandre Debut, Quentin Lhoest, Thomas Wolf, Anthony Moi, Pierric Cistac, Tim Rault, Rémi Louf, Morgan Funtowicz, Joe Davison, Clara Ma, J. Plu, Canwen Xu, Sylvain Gugger, Mariama Drame, Alexander M. Rush