distilbert-base-es-cased

We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.

Our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.

For more information, please refer to our paper: Load What You Need: Smaller Versions of Multilingual BERT.
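
As a quick sanity check (not part of the original card), this claim can be illustrated by comparing the hidden states of the smaller model with those of distilbert-base-multilingual-cased on a Spanish sentence. The sentence below is only an example, and the check assumes the text is fully covered by the reduced Spanish vocabulary:

import torch
from transformers import AutoTokenizer, AutoModel

text = "Hola, ¿cómo estás?"  # illustrative sentence, not from the original card

# Smaller Spanish model
es_tok = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-es-cased")
es_model = AutoModel.from_pretrained("Geotrend/distilbert-base-es-cased")

# Original multilingual model
ml_tok = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
ml_model = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

with torch.no_grad():
    es_hidden = es_model(**es_tok(text, return_tensors="pt")).last_hidden_state
    ml_hidden = ml_model(**ml_tok(text, return_tensors="pt")).last_hidden_state

# Token ids differ (the vocabularies differ), but the resulting
# representations should match for text covered by the Spanish vocabulary
print(torch.allclose(es_hidden, ml_hidden, atol=1e-5))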

How to use

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-es-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-es-cased")
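
For example, the loaded tokenizer and model can then be used to embed a Spanish sentence (the sentence below is illustrative, not from the original card):

import torch

inputs = tokenizer("Me encanta el procesamiento del lenguaje natural.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)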

To generate other smaller versions of multilingual transformers, please visit our GitHub repo.

How to cite

@inproceedings{smallermdistilbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
  author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
  booktitle={SustaiNLP / EMNLP},
  year={2020}
}

Contact

Please contact amine@geotrend.fr for any questions, feedback, or requests.