Model:
RUCAIBox/elmer
The ELMER model was proposed in ELMER: A Non-Autoregressive Pre-trained Language Model for Efficient and Effective Text Generation by Junyi Li, Tianyi Tang, Wayne Xin Zhao, Jian-Yun Nie and Ji-Rong Wen in 2022.
Detailed information and instructions can be found at https://github.com/RUCAIBox/ELMER.
ELMER is an efficient and effective PLM for non-autoregressive text generation, which generates tokens at different layers by leveraging the early exit technique.
The architecture of ELMER is a variant of the standard Transformer encoder-decoder and makes three technical contributions, which are described in the paper and the repository above.
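To make the early-exit idea above concrete, here is a minimal, hypothetical sketch in PyTorch: every decoder layer gets its own prediction head ("exit"), and each target position is emitted at the first layer whose prediction is confident enough. This is an illustration only, not ELMER's actual implementation; all class names, dimensions, and the confidence threshold are assumptions, and cross-attention to the encoder is omitted.

import torch
import torch.nn as nn

class EarlyExitDecoderSketch(nn.Module):
    """Toy decoder stack where every layer has its own LM head ("exit")."""

    def __init__(self, vocab_size=50265, hidden=768, heads=12, num_layers=6, threshold=0.9):
        super().__init__()
        # Bi-directional (non-causal) layers stand in for ELMER's decoder, which drops
        # the causal mask; cross-attention to the encoder is omitted for brevity.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model=hidden, nhead=heads, batch_first=True)
            for _ in range(num_layers)
        )
        self.exits = nn.ModuleList(nn.Linear(hidden, vocab_size) for _ in range(num_layers))
        self.threshold = threshold

    @torch.no_grad()
    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden); all target positions are filled in parallel.
        batch, seq_len, _ = hidden_states.shape
        tokens = torch.full((batch, seq_len), -1, dtype=torch.long)
        exited = torch.zeros(batch, seq_len, dtype=torch.bool)
        for layer, exit_head in zip(self.layers, self.exits):
            hidden_states = layer(hidden_states)
            probs = exit_head(hidden_states).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            # A position "exits" at the first layer whose prediction is confident enough.
            exit_now = (conf >= self.threshold) & ~exited
            tokens[exit_now] = pred[exit_now]
            exited |= exit_now
        # Positions that never exited fall back to the last layer's prediction.
        tokens[~exited] = pred[~exited]
        return tokens

# Example: predict 8 target positions for a batch of 2 from random decoder inputs.
sketch = EarlyExitDecoderSketch()
print(sketch(torch.randn(2, 8, 768)).shape)  # torch.Size([2, 8])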
To fine-tune ELMER on non-autoregressive text generation:
>>> from transformers import BartTokenizer as ElmerTokenizer
>>> from transformers import BartForConditionalGeneration as ElmerForConditionalGeneration

>>> tokenizer = ElmerTokenizer.from_pretrained("RUCAIBox/elmer")
>>> model = ElmerForConditionalGeneration.from_pretrained("RUCAIBox/elmer")
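As a hedged illustration only: once the tokenizer and model are loaded as above, a standard seq2seq training step can be written with the usual transformers API. Note that this uses the autoregressive loss of BartForConditionalGeneration; ELMER's own non-autoregressive fine-tuning with early exit is implemented in the repository linked above, and the input/target texts below are placeholders.

>>> src = tokenizer(["input document ..."], return_tensors="pt", padding=True)
>>> labels = tokenizer(["target text ..."], return_tensors="pt", padding=True).input_ids
>>> outputs = model(input_ids=src.input_ids, attention_mask=src.attention_mask, labels=labels)
>>> outputs.loss.backward()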
@inproceedings{lijunyi2022elmer,
  title={ELMER: A Non-Autoregressive Pre-trained Language Model for Efficient and Effective Text Generation},
  author={Li, Junyi and Tang, Tianyi and Zhao, Wayne Xin and Nie, Jian-Yun and Wen, Ji-Rong},
  booktitle={EMNLP 2022},
  year={2022}
}