Abstract

Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel. mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021.

Keywords

Transformer · Computer science · Massively parallel · Computational linguistics · Association (psychology) · Artificial intelligence · Natural language processing · Electrical engineering · Engineering · Parallel computing · Philosophy · Epistemology

Related Publications

Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: ...

2019 · 30394 citations

A Tool for Reviewers

Peer review lies at the core of science and academic life. In one of its most pervasive forms, peer review for the scientific literature is the main mechanism that research jour...

2001 · Academic Medicine · 27 citations

Publication Info

Year: 2021
Type: article
Citations: 1428
Access: Closed

Citation Metrics

OpenAlex: 1428
Influential: 293
CrossRef: 629

Cite This

Linting Xue, Noah Constant, Adam P. Roberts et al. (2021). mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. https://doi.org/10.18653/v1/2021.naacl-main.41

Identifiers

DOI: 10.18653/v1/2021.naacl-main.41
arXiv: 2010.11934

Data Quality

Data completeness: 84%