Model:
echarlaix/t5-small-onnx
T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, where each task is converted into a text-to-text format.
For more information, please refer to the original paper.
Paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Authors: Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
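
Because every task is cast into the same text-to-text format, the task is selected simply by prepending a prefix to the input text. A minimal sketch, assuming this ONNX checkpoint and using one of the standard T5 prefixes; the generation settings are illustrative only:

from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("echarlaix/t5-small-onnx")
model = ORTModelForSeq2SeqLM.from_pretrained("echarlaix/t5-small-onnx")

# The task is encoded directly in the input string via a prefix
inputs = tokenizer("translate English to German: The house is wonderful.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))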
You can use this model with the Transformers pipeline.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Load the tokenizer and the ONNX export of T5-small
tokenizer = AutoTokenizer.from_pretrained("echarlaix/t5-small-onnx")
model = ORTModelForSeq2SeqLM.from_pretrained("echarlaix/t5-small-onnx")

# English-to-French translation pipeline backed by ONNX Runtime
translator = pipeline("translation_en_to_fr", model=model, tokenizer=tokenizer)
results = translator("My name is Eustache and I have a pet raccoon")
print(results)
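
If you would rather produce an equivalent ONNX model yourself instead of downloading this one, Optimum can export the original PyTorch checkpoint at load time. A sketch under the assumption of a recent Optimum version, where export=True triggers the conversion; "t5-small" and the output directory are example names:

from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Export the PyTorch t5-small checkpoint to ONNX when loading
model = ORTModelForSeq2SeqLM.from_pretrained("t5-small", export=True)
tokenizer = AutoTokenizer.from_pretrained("t5-small")

# Save the ONNX files so they can be reloaded later without re-exporting
model.save_pretrained("t5_small_onnx")
tokenizer.save_pretrained("t5_small_onnx")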