Model:
KoboldAI/OPT-2.7B-Nerys-v2
OPT 2.7B-Nerys is a finetune created using Facebook's OPT model.
The training data consists of around 2,500 ebooks in various genres (the "Pike" dataset), a CYOA dataset ("CYS") and 50 Asian "light novels" (the "Manga-v1" dataset). Most of the dataset is prefixed with the following text: [Genre: <genre1>, <genre2>]. The dataset was cleaned in the same way as fairseq-dense-13B-Nerys-v2.
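Because most of the training data carries this genre prefix, prompts at inference time can follow the same convention. A minimal sketch of building such a prompt (the genre names and story text below are illustrative examples, not part of the model card):

>>> genres = ["Fantasy", "Adventure"]
>>> prompt = f"[Genre: {', '.join(genres)}] The airship drifted over the ruined city."
>>> prompt
'[Genre: Fantasy, Adventure] The airship drifted over the ruined city.'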
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it is run:
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/OPT-2.7B-Nerys-v2')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
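For more control over decoding than the pipeline exposes, the model can also be loaded with the standard transformers classes. A minimal sketch assuming an ordinary transformers install (the sampling settings here are illustrative, not recommendations from the model card):

>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained('KoboldAI/OPT-2.7B-Nerys-v2')
>>> model = AutoModelForCausalLM.from_pretrained('KoboldAI/OPT-2.7B-Nerys-v2')
>>> inputs = tokenizer("Welcome Captain Janeway, I apologize for the delay.", return_tensors='pt')
>>> outputs = model.generate(**inputs, do_sample=True, max_new_tokens=50)
>>> tokenizer.decode(outputs[0], skip_special_tokens=True)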
Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion).
OPT-2.7B is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
@misc{zhang2022opt,
  title={OPT: Open Pre-trained Transformer Language Models},
  author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
  year={2022},
  eprint={2205.01068},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}