
Introduction

This repository provides an implementation of the PT-EN (Portuguese-to-English) translation task using T5 on modest hardware. We made some improvements to the tokenizer and the post-processing, which improved the results, and we used a Portuguese pretrained model for translation. You can find more information in our repository. Also check our paper.

Usage

Just follow the "Use in Transformers" instructions. A few words must be prepended to the input to define the task for T5 (the prefix "translate Portuguese to English:", as shown in the example below).

You can also create a pipeline for it. Here is an example with the sentence "Eu gosto de comer arroz." ("I like to eat rice"):

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("unicamp-dl/translation-pt-en-t5")
model = AutoModelForSeq2SeqLM.from_pretrained("unicamp-dl/translation-pt-en-t5")

# Build a text2text-generation pipeline and translate one sentence;
# note the task prefix "translate Portuguese to English: "
pten_pipeline = pipeline('text2text-generation', model=model, tokenizer=tokenizer)
pten_pipeline("translate Portuguese to English: Eu gosto de comer arroz.")
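Since T5 relies on the task prefix to know which translation direction to run, batch translation can be sketched by prepending the same prefix to every Portuguese sentence before calling the pipeline. The helper below is an illustrative sketch, not part of the repository:

```python
# Task prefix matching the single-sentence example above
PREFIX = "translate Portuguese to English: "

def prepare_inputs(sentences):
    """Return prefixed inputs ready for the translation pipeline (hypothetical helper)."""
    return [PREFIX + s for s in sentences]

inputs = prepare_inputs(["Eu gosto de comer arroz.", "Obrigado!"])
# pten_pipeline(inputs) would then translate the whole batch in one call
```

Passing a list to the pipeline returns one result dict per input, so the translations can be read from each dict's 'generated_text' field.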

Citation

@inproceedings{lopes-etal-2020-lite,
    title = "Lite Training Strategies for {P}ortuguese-{E}nglish and {E}nglish-{P}ortuguese Translation",
    author = "Lopes, Alexandre  and
      Nogueira, Rodrigo  and
      Lotufo, Roberto  and
      Pedrini, Helio",
    booktitle = "Proceedings of the Fifth Conference on Machine Translation",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.wmt-1.90",
    pages = "833--840",
}