ruBert-base

The model was trained by the SberDevices team.

  • Task: mask filling
  • Type: encoder
  • Tokenizer: BPE
  • Dict size: 120 138
  • Num Parameters: 178 M
  • Training Data Volume: 30 GB
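Since the model's task is mask filling, it can be used directly with the Hugging Face `transformers` fill-mask pipeline. The sketch below is a minimal, hedged example: the hub identifier `ai-forever/ruBert-base` and the `[MASK]` token are assumptions — check the model card on the Hub for the canonical repository name and the tokenizer's actual mask token.

```python
# Minimal sketch: mask filling with ruBert-base via transformers.
# Assumptions: hub id "ai-forever/ruBert-base" and mask token "[MASK]" --
# verify both on the model's Hub page before use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruBert-base")

# A Russian sentence with one masked token.
sentence = "Столица России — [MASK]."

# Print the top predictions with their scores.
for prediction in fill_mask(sentence):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction is a dict containing the filled token (`token_str`), its probability (`score`), and the completed sequence; the pipeline returns the top candidates sorted by score.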

Authors