Model:

ai-forever/rugpt2large


rugpt2large

The model was trained with a sequence length of 1024 using the transformers library by the SberDevices team, on 170 GB of data, on 64 GPUs, for 3 weeks.
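A minimal usage sketch, assuming the model loads through the standard transformers causal-LM API (the prompt and generation parameters below are illustrative, not taken from the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ai-forever/rugpt2large"

# Load tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Example Russian prompt (hypothetical, for illustration only).
text = "Александр Сергеевич Пушкин родился в "
input_ids = tokenizer.encode(text, return_tensors="pt")

# Sample a continuation; keep the total length within the 1024-token
# training sequence length mentioned above.
output_ids = model.generate(
    input_ids,
    max_length=64,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```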