ruT5-large

The model was trained by SberDevices.

  • Task: text2text generation
  • Type: encoder-decoder
  • Tokenizer: BPE
  • Dictionary size: 32,101
  • Number of parameters: 737 M
  • Training data volume: 300 GB
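A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id `ai-forever/ruT5-large` is an assumption (the model has been published under different organization names); check the model hub for the current name. Running this downloads the full checkpoint, so it is illustrative rather than a quick test.

```python
# Sketch: loading an encoder-decoder T5 checkpoint for text2text generation.
# The model id "ai-forever/ruT5-large" is an assumption — verify it on the hub.
from transformers import T5ForConditionalGeneration, T5Tokenizer

MODEL_ID = "ai-forever/ruT5-large"

tokenizer = T5Tokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

text = "Текст для генерации"  # any Russian input text
input_ids = tokenizer(text, return_tensors="pt").input_ids

# Greedy text2text generation, capped at 50 new tokens
output_ids = model.generate(input_ids, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```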

Authors