
Transformers implemented in PyTorch

linker   ·   Posted 2019-11-20   ·   AI

Link => /huggingface/transformers

Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with over 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.
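To make the "deep interoperability" concrete, here is a minimal sketch of loading a pretrained BERT model and running it in PyTorch, along the lines of the library's quick-start; it assumes transformers and torch are installed, and the model name "bert-base-uncased" is just one of the available pretrained checkpoints.

import torch
from transformers import AutoTokenizer, AutoModel

# Download (or load from cache) the pretrained tokenizer and model weights
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and wrap it in a batch dimension
input_ids = torch.tensor([tokenizer.encode("Hello, world!", add_special_tokens=True)])

# Forward pass without gradients; the first output is the last hidden states,
# with shape (batch_size, sequence_length, hidden_size)
with torch.no_grad():
    last_hidden_states = model(input_ids)[0]

The same checkpoint names can also be loaded with the library's TensorFlow 2.0 classes, which is what the interoperability between the two frameworks refers to.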


