Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e., BERT (and its variants) and GPT (and its variants), the former has been extensively studied in the biomedical domain, with models such as BioBERT and PubMedBERT. While these models have achieved great success on a variety of discriminative downstream biomedical tasks, their lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. In particular, we obtain F1 scores of 44.98%, 38.42% and 40.76% on the BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2% accuracy on PubMedQA, setting a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature in generating fluent descriptions for biomedical terms.
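As a quick way to try the generation ability described above, the sketch below prompts BioGPT with a biomedical term and lets it continue with a description. It assumes the Hugging Face `transformers` port of BioGPT (`BioGptTokenizer`, `BioGptForCausalLM`, checkpoint `microsoft/biogpt`) rather than this repository's own training and inference scripts; the prompt and decoding settings are illustrative only.

```python
# Minimal sketch (assumes the Hugging Face port of BioGPT, not this repo's own pipeline).
from transformers import BioGptForCausalLM, BioGptTokenizer

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

# Prompt with a biomedical term and let the model continue with a description.
inputs = tokenizer("COVID-19 is", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, num_beams=5, early_stopping=True)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```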
If you find BioGPT useful in your research, please cite the following paper:
```bibtex
@article{10.1093/bib/bbac409,
    author   = {Luo, Renqian and Sun, Liai and Xia, Yingce and Qin, Tao and Zhang, Sheng and Poon, Hoifung and Liu, Tie-Yan},
    title    = "{BioGPT: generative pre-trained transformer for biomedical text generation and mining}",
    journal  = {Briefings in Bioinformatics},
    volume   = {23},
    number   = {6},
    year     = {2022},
    month    = {09},
    abstract = "{Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and GPT (and its variants), the first one has been extensively studied in the biomedical domain, such as BioBERT and PubMedBERT. While they have achieved great success on a variety of discriminative downstream biomedical tasks, the lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. Especially, we get 44.98\%, 38.42\% and 40.76\% F1 score on BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2\% accuracy on PubMedQA, creating a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature to generate fluent descriptions for biomedical terms.}",
    issn     = {1477-4054},
    doi      = {10.1093/bib/bbac409},
    url      = {https://doi.org/10.1093/bib/bbac409},
    note     = {bbac409},
    eprint   = {https://academic.oup.com/bib/article-pdf/23/6/bbac409/47144271/bbac409.pdf},
}
```