ruSciBERT

The model was trained by the Sber AI team and the MLSA Lab of the Institute for AI, MSU. If you use our model in your project, please let us know at nikgerasimenko@gmail.com.

Presentation at AI Journey 2022

  • Task: masked language modeling (fill-mask); see the usage sketch after this list
  • Type: encoder
  • Tokenizer: BPE
  • Vocabulary size: 50,265
  • Number of parameters: 123M
  • Training data volume: 6.5 GB
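
A minimal fill-mask sketch using the Hugging Face `transformers` pipeline. The checkpoint identifier `ai-forever/ruSciBERT` is an assumption; replace it with the actual model name if it differs.

```python
# Minimal sketch: fill-mask inference with the encoder described above.
# NOTE: the checkpoint name below is an assumption, not confirmed by this card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruSciBERT")

# The tokenizer defines the mask token (e.g. <mask> for BPE/RoBERTa-style models).
mask = fill_mask.tokenizer.mask_token
text = f"Нейронные сети широко применяются в {mask} обработке естественного языка."

# Print the top predicted tokens for the masked position with their scores.
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 3))
```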