Model:
facebook/mbart-large-50-many-to-one-mmt
This model is a fine-tuned checkpoint of mBART-large-50. mbart-large-50-many-to-one-mmt is fine-tuned for multilingual machine translation. It was introduced in the paper Multilingual Translation with Extensible Multilingual Pretraining and Finetuning. The model can translate directly from any of 50 languages to English.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
article_ar = "الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")

# translate Hindi to English
tokenizer.src_lang = "hi_IN"
encoded_hi = tokenizer(article_hi, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The head of the UN says there is no military solution in Syria."

# translate Arabic to English
tokenizer.src_lang = "ar_AR"
encoded_ar = tokenizer(article_ar, return_tensors="pt")
generated_tokens = model.generate(**encoded_ar)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The Secretary-General of the United Nations says there is no military solution in Syria."
See the model hub to find more fine-tuned versions.
Languages covered:
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovenian (sl_SI)
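As an illustration, the language codes above can be kept in a small lookup table so that `tokenizer.src_lang` is always set from a language name rather than a hand-typed code. The helper below is a hypothetical convenience (not part of transformers); only a subset of the table is shown:

```python
# Hypothetical helper: map a few of the language names above to their mBART-50 codes.
# Extend the dict with any entry from the full list above.
MBART50_CODES = {
    "Arabic": "ar_AR",
    "Chinese": "zh_CN",
    "German": "de_DE",
    "Hindi": "hi_IN",
    "Japanese": "ja_XX",
}

def src_lang_code(language: str) -> str:
    """Return the mBART-50 source-language code, e.g. 'zh_CN' for Chinese."""
    try:
        return MBART50_CODES[language]
    except KeyError:
        raise ValueError(f"{language!r} is not in the table; see the full list above")

# Usage with the tokenizer from the example above:
#   tokenizer.src_lang = src_lang_code("Chinese")
print(src_lang_code("Chinese"))  # zh_CN
```

Keeping the codes in one place avoids silent mistakes such as passing "zh_XX" (which is not a valid mBART-50 code) as the source language.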
@article{tang2020multilingual,
    title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
    author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
    year={2020},
    eprint={2008.00401},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}