Model: AdapterHub/roberta-base-pf-hotpotqa
An adapter for the roberta-base model that was trained on the hotpot_qa dataset and includes a prediction head for question answering.
This adapter was created for usage with the adapter-transformers library.
First, install `adapter-transformers`:

```bash
pip install -U adapter-transformers
```
Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support.
Now, the adapter can be loaded and activated like this:
```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-hotpotqa", source="hf")
model.active_adapters = adapter_name
```
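Once the adapter is active, the question-answering head can be queried directly. The sketch below shows one way to do this; the question/context pair is purely illustrative (not taken from HotpotQA), and it assumes the head exposes the standard extractive-QA `start_logits`/`end_logits` outputs:

```python
import torch
from transformers import AutoTokenizer, AutoModelWithHeads

# Load the tokenizer and base model, then activate the HotpotQA adapter as above.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-hotpotqa", source="hf")
model.active_adapters = adapter_name

# Illustrative inputs (hypothetical example, not from the dataset).
question = "Which city is the Eiffel Tower located in?"
context = "The Eiffel Tower is a wrought-iron lattice tower on the Champ de Mars in Paris, France."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The QA head predicts start/end logits over the input tokens;
# greedily decode the most likely answer span.
start = int(torch.argmax(outputs.start_logits))
end = int(torch.argmax(outputs.end_logits)) + 1
print(tokenizer.decode(inputs["input_ids"][0][start:end]).strip())
```

The greedy argmax decoding above is the simplest possible span extraction; production QA code typically scores start/end pairs jointly and constrains the span to the context tokens.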
The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, that repository contains the training configurations for all tasks.
Refer to the paper for more information on results.
If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
```bibtex
@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton and Pfeiffer, Jonas and R{\"u}ckl{\'e}, Andreas and Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}
```