Model:
l3cube-pune/marathi-gpt
MahaGPT is a Marathi GPT2 model, pre-trained on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets. [dataset link](https://github.com/l3cube-pune/MarathiNLP)
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).
@article{joshi2022l3cube,
  title={L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2202.01159},
  year={2022}
}
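
The model can be loaded with the Hugging Face `transformers` library. Below is a minimal sketch of text generation with MahaGPT; the prompt and the generation parameters (`max_new_tokens`, `do_sample`, `top_p`) are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch: load MahaGPT and generate a Marathi continuation.
# The prompt and sampling settings are only examples.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("l3cube-pune/marathi-gpt")
model = AutoModelForCausalLM.from_pretrained("l3cube-pune/marathi-gpt")

prompt = "महाराष्ट्र"  # example Marathi prompt ("Maharashtra")
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```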