Spanish GPT-2

GPT-2 model trained from scratch on the Spanish portion of OSCAR. The model was trained with Flax on TPUs sponsored by Google, as part of the Flax/JAX Community Week organised by HuggingFace.

Model description

The model used for training is OpenAI's GPT-2, introduced in the paper "Language Models are Unsupervised Multitask Learners" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.

This model is available in the 🤗 Model Hub.
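As a quick usage sketch with the transformers library (the Hub repository id flax-community/gpt-2-spanish below is an assumption; substitute the model's actual id):

```python
# Minimal generation sketch using the Flax weights. The repo id is an
# assumption; replace it with the model's actual Hub id.
from transformers import AutoTokenizer, FlaxGPT2LMHeadModel

repo_id = "flax-community/gpt-2-spanish"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = FlaxGPT2LMHeadModel.from_pretrained(repo_id)

# Encode a Spanish prompt and sample a continuation.
inputs = tokenizer("Érase una vez", return_tensors="np")
outputs = model.generate(inputs["input_ids"], max_length=50, do_sample=True, top_k=50)
print(tokenizer.decode(outputs.sequences[0], skip_special_tokens=True))
```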

Training data

Spanish portion of OSCAR, or Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.

This corpus is available in the 🤗 Datasets library.
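A minimal loading sketch with the datasets library; the Spanish config name unshuffled_deduplicated_es is assumed here from the OSCAR loading script, and streaming avoids downloading the full corpus up front:

```python
# Loading sketch for the Spanish portion of OSCAR. The config name is
# an assumption based on the OSCAR loading script.
from itertools import islice
from datasets import load_dataset

dataset = load_dataset("oscar", "unshuffled_deduplicated_es", split="train", streaming=True)

# Peek at the first few documents.
for example in islice(dataset, 3):
    print(example["text"][:100])
```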

Team members