Model: l3cube-pune/marathi-bert
A newer version of this model is available here: https://huggingface.co/l3cube-pune/marathi-bert-v2
MahaBERT is a Marathi BERT model. It is a multilingual BERT (bert-base-multilingual-cased) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets. [Dataset link](https://github.com/l3cube-pune/MarathiNLP)
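Below is a minimal usage sketch with the Hugging Face transformers library, loading the model for masked-token prediction; the Marathi example sentence and the fill-mask task are illustrative assumptions, not part of the original card.

```python
# Minimal sketch: load MahaBERT and run a fill-mask prediction.
# The example sentence is illustrative only.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_name = "l3cube-pune/marathi-bert"  # or "l3cube-pune/marathi-bert-v2" for the newer version

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Predict the masked Marathi token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("मी [MASK] खातो."):  # "I eat [MASK]."
    print(prediction["token_str"], prediction["score"])
```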
More details on the dataset, models, and benchmark results can be found in our [paper](https://arxiv.org/abs/2202.01159).
    @InProceedings{joshi:2022:WILDRE6,
      author    = {Joshi, Raviraj},
      title     = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
      booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
      month     = {June},
      year      = {2022},
      address   = {Marseille, France},
      publisher = {European Language Resources Association},
      pages     = {97--101}
    }