Model: racai/distilbert-base-romanian-cased

Romanian DistilBERT

This repository contains the cased Romanian DistilBERT (referred to as Distil-BERT-base-ro in the paper). The teacher model used for distillation is dumitrescustefan/bert-base-romanian-cased-v1.

The model was introduced in this paper (arXiv:2112.12650; see the BibTeX entry below). The accompanying code can be found here.

Usage

from transformers import AutoTokenizer, AutoModel

# load the tokenizer and the model
tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
model = AutoModel.from_pretrained("racai/distilbert-base-romanian-cased")

# tokenize a test sentence
input_ids = tokenizer.encode("Aceasta este o propoziție de test.", add_special_tokens=True, return_tensors="pt")

# run the tokens through the model
outputs = model(input_ids)

print(outputs)
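
The snippet above prints the raw model outputs (the last hidden states). As a hypothetical extension, not part of the original card, a fixed-size sentence embedding can be obtained by mean-pooling the last hidden state over the attention mask, for example:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
model = AutoModel.from_pretrained("racai/distilbert-base-romanian-cased")

# encode the sentence, keeping the attention mask for pooling
encoded = tokenizer("Aceasta este o propoziție de test.", return_tensors="pt")

with torch.no_grad():
    outputs = model(**encoded)

# mean-pool the last hidden state over the non-padding tokens
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

print(sentence_embedding.shape)  # e.g. torch.Size([1, 768]) for a base-sized encoder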

Model Size

It is 35% smaller than its teacher model, bert-base-romanian-cased-v1.

Model                            Size (MB)   Params (Millions)
bert-base-romanian-cased-v1      477         124
distilbert-base-romanian-cased   312         81
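
As a quick sanity check, not from the original card, the parameter count can be read directly off the loaded model and should land close to the ~81M figure in the table:

from transformers import AutoModel

model = AutoModel.from_pretrained("racai/distilbert-base-romanian-cased")

# count all parameters of the distilled encoder, reported in millions
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.1f}M parameters")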

Evaluation

We evaluated the model against its teacher on the following Romanian tasks and metrics:

  • UPOS: universal part-of-speech tagging (F1-macro)
  • XPOS: extended part-of-speech tagging (F1-macro)
  • NER: named entity recognition (F1-macro)
  • SAPN: sentiment analysis - positive vs. negative (accuracy)
  • SAR: sentiment analysis - rating (F1-macro)
  • DI: dialect identification (F1-macro)
  • STS: semantic textual similarity (Pearson correlation)

Model                            UPOS    XPOS    NER     SAPN    SAR     DI      STS
bert-base-romanian-cased-v1      98.00   96.46   85.88   98.07   79.61   95.58   80.30
distilbert-base-romanian-cased   97.97   97.08   83.35   98.20   80.51   96.31   80.57
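
The exact evaluation setup is described in the paper and the accompanying code. Purely as an illustration, and not the authors' evaluation script, a token-classification head (as used for tasks like UPOS, XPOS, or NER) could be attached to this checkpoint as follows; the label count below is a placeholder.

from transformers import AutoTokenizer, AutoModelForTokenClassification

# hypothetical label count; the real tag sets come from the task datasets
num_labels = 17  # e.g. the 17 Universal POS tags for UPOS

tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "racai/distilbert-base-romanian-cased",
    num_labels=num_labels,
)

# the classification head is randomly initialized and needs fine-tuning
# on task data before its predictions are meaningful
inputs = tokenizer("Aceasta este o propoziție de test.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (batch, sequence_length, num_labels)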

BibTeX entry and citation info

@article{avram2021distilling,
  title={Distilling the Knowledge of Romanian BERTs Using Multiple Teachers},
  author={Andrei-Marius Avram and Darius Catrina and Dumitru-Clementin Cercel and Mihai Dascălu and Traian Rebedea and Vasile Păiş and Dan Tufiş},
  journal={ArXiv},
  year={2021},
  volume={abs/2112.12650}
}