Model: racai/distilbert-base-romanian-cased
This repository contains the cased Romanian DistilBERT (named Distil-BERT-base-ro in the paper). The teacher model used for distillation is dumitrescustefan/bert-base-romanian-cased-v1.
The model was introduced in this paper. The accompanying code can be found here.
```python
from transformers import AutoTokenizer, AutoModel

# load the tokenizer and the model
tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
model = AutoModel.from_pretrained("racai/distilbert-base-romanian-cased")

# tokenize a test sentence
input_ids = tokenizer.encode("Aceasta este o propoziție de test.", add_special_tokens=True, return_tensors="pt")

# run the tokens through the model
outputs = model(input_ids)

print(outputs)
```
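The call above returns the full hidden states. If you need one vector per sentence, a common approach (not prescribed by this model card) is to mean-pool the last hidden state over non-padding tokens. A minimal sketch, assuming a standard DistilBERT-base configuration with 768-dimensional hidden states:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
model = AutoModel.from_pretrained("racai/distilbert-base-romanian-cased")

# tokenize; the attention mask marks real tokens vs. padding
encoded = tokenizer("Aceasta este o propoziție de test.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**encoded)

# mean-pool the last hidden state over non-padding positions
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # expected: torch.Size([1, 768]) for a DistilBERT-base-sized model
```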
It is 35% smaller than its teacher model, bert-base-romanian-cased-v1.
| Model | Size (MB) | Params (Millions) |
|---|---|---|
| bert-base-romanian-cased-v1 | 477 | 124 |
| distilbert-base-romanian-cased | 312 | 81 |
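The parameter counts in the table can be checked directly. A quick sketch, assuming both checkpoints can be downloaded and held in memory:

```python
from transformers import AutoModel

# count trainable parameters of teacher and student (reported in millions above)
for name in ["dumitrescustefan/bert-base-romanian-cased-v1",
             "racai/distilbert-base-romanian-cased"]:
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```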
We evaluated the model against its teacher on 5 Romanian tasks:
| Model | UPOS | XPOS | NER | SAPN | SAR | DI | STS |
|---|---|---|---|---|---|---|---|
| bert-base-romanian-cased-v1 | 98.00 | 96.46 | 85.88 | 98.07 | 79.61 | 95.58 | 80.30 |
| distilbert-base-romanian-cased | 97.97 | 97.08 | 83.35 | 98.20 | 80.51 | 96.31 | 80.57 |
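The scores above come from the paper's evaluation setup, which this card does not reproduce. For orientation only, the sketch below shows how a token-classification head (e.g. for UPOS tagging or NER) could be attached to the distilled checkpoint; the head is randomly initialized, and `num_labels=17` (the Universal Dependencies UPOS tag count) is an illustrative assumption, not a value from the paper:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("racai/distilbert-base-romanian-cased")
# num_labels=17 is illustrative (UD UPOS tagset); the head still needs fine-tuning
model = AutoModelForTokenClassification.from_pretrained(
    "racai/distilbert-base-romanian-cased", num_labels=17
)

inputs = tokenizer("Aceasta este o propoziție de test.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (batch, seq_len, num_labels)
print(logits.shape)
```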
```bibtex
@article{avram2021distilling,
  title={Distilling the Knowledge of Romanian BERTs Using Multiple Teachers},
  author={Andrei-Marius Avram and Darius Catrina and Dumitru-Clementin Cercel and Mihai Dascălu and Traian Rebedea and Vasile Păiş and Dan Tufiş},
  journal={ArXiv},
  year={2021},
  volume={abs/2112.12650}
}
```