
t5-small

Model description

T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, where every task is converted into a text-to-text format.

For more information, please refer to the original paper.

Paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

Authors: Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
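
As a rough illustration of the text-to-text setup described above (a sketch, not part of the original card; the task prefixes follow the conventions used for T5, and the target strings are purely illustrative), every task is expressed as a prefixed input string paired with a target string:

# Sketch of the text-to-text format: each task becomes a plain-text
# input carrying a task prefix, paired with a plain-text target.
# The prefixes follow T5's conventions; the targets are illustrative.
examples = [
    ("translate English to German: That is good.", "Das ist gut."),
    ("cola sentence: The course is jumping well.", "not acceptable"),
    ("summarize: <article text>", "<short summary>"),
]
for source, target in examples:
    print(f"input:  {source}")
    print(f"target: {target}")
    print()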

Usage example

You can use this model with a Transformers pipeline.

from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Load the tokenizer and the ONNX Runtime seq2seq model
tokenizer = AutoTokenizer.from_pretrained("echarlaix/t5-small-dynamic")
model = ORTModelForSeq2SeqLM.from_pretrained("echarlaix/t5-small-dynamic")

# Build an English-to-French translation pipeline on top of the ONNX model
translator = pipeline("translation_en_to_fr", model=model, tokenizer=tokenizer)

text = "He never went out without a book under his arm, and he often came back with two."
results = translator(text)
print(results)
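
The translation pipeline adds the task prefix for you. As a sketch (assuming the same checkpoint as above and that the ONNX seq2seq model exposes generate(), as optimum seq2seq models generally do), you can also call the model directly and supply the prefix yourself:

from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Sketch: call generate() on the ONNX model directly instead of using pipeline().
tokenizer = AutoTokenizer.from_pretrained("echarlaix/t5-small-dynamic")
model = ORTModelForSeq2SeqLM.from_pretrained("echarlaix/t5-small-dynamic")

# When bypassing the pipeline, the task prefix must be added manually.
text = "translate English to French: He never went out without a book under his arm."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))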