Model:
EMBEDDIA/crosloengual-bert
CroSloEngual BERT is a trilingual model, using the bert-base architecture, trained on Croatian, Slovenian, and English corpora. Compared to multilingual BERT, it performs better on these three languages, while also offering cross-lingual knowledge transfer, which monolingual models do not provide.
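The checkpoint can be loaded like any other BERT-style model from the Hugging Face Hub. Below is a minimal sketch using the Transformers library, assuming the standard AutoTokenizer/AutoModelForMaskedLM API applies to this checkpoint; the example Slovenian sentence is purely illustrative.

    # Minimal usage sketch for EMBEDDIA/crosloengual-bert (assumes `transformers` is installed).
    from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

    model_id = "EMBEDDIA/crosloengual-bert"

    # Load the tokenizer and the masked-language-model head.
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)

    # Illustrative fill-mask query on a Slovenian sentence.
    fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
    for pred in fill("Ljubljana je glavno [MASK] Slovenije."):
        print(pred["token_str"], round(pred["score"], 3))

For downstream tasks (e.g. NER or classification), the same model_id can be passed to the corresponding AutoModelFor* class and fine-tuned as with any BERT checkpoint.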
Evaluation results are provided in our paper:
@Inproceedings{ulcar-robnik2020finest,
  author    = "Ulčar, M. and Robnik-Šikonja, M.",
  year      = 2020,
  title     = "{FinEst BERT} and {CroSloEngual BERT}: less is more in multilingual models",
  editor    = "Sojka, P and Kopeček, I and Pala, K and Horák, A",
  booktitle = "Text, Speech, and Dialogue {TSD 2020}",
  series    = "Lecture Notes in Computer Science",
  volume    = 12284,
  publisher = "Springer",
  url       = "https://doi.org/10.1007/978-3-030-58323-1_11",
}
The preprint is available at arxiv.org/abs/2006.07890.