A pre-training and self-training approach for biomedical named entity recognition

Shang Gao, Olivera Kotevska, Alexandre Sorokine, J. Blair Christian

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

Named entity recognition (NER) is a key component of many scientific literature mining tasks, such as information retrieval, information extraction, and question answering; however, many modern approaches require large amounts of labeled training data in order to be effective. This severely limits the effectiveness of NER models in applications where expert annotations are difficult and expensive to obtain. In this work, we explore the effectiveness of transfer learning and semi-supervised self-training to improve the performance of NER models in biomedical settings with very limited labeled data (250-2000 labeled samples). We first pre-train a BiLSTM-CRF and a BERT model on a very large general biomedical NER corpus such as MedMentions or Semantic Medline, and then we fine-tune the model on a more specific target NER task that has very limited training data; finally, we apply semi-supervised self-training using unlabeled data to further boost model performance. We show that in NER tasks that focus on common biomedical entity types such as those in the Unified Medical Language System (UMLS), combining transfer learning with self-training enables a NER model such as a BiLSTM-CRF or BERT to match the performance of the same model trained on 3x-8x the amount of labeled data. We further show that our approach can also boost performance in a low-resource application where entity types are rarer and not specifically covered in UMLS.
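The abstract's three-stage recipe (pre-train on a large general corpus, fine-tune on the small target task, then self-train on unlabeled text) can be sketched in code. The following is a minimal illustrative sketch, not the paper's implementation: a toy BiLSTM tagger stands in for the BiLSTM-CRF and BERT models (the CRF layer is omitted for brevity), the synthetic tensors stand in for MedMentions/Semantic Medline and the small target corpus, and the sentence-level confidence threshold is an assumed pseudo-label selection heuristic; the paper's actual selection criterion, hyperparameters, and data handling may differ.

```python
# Minimal sketch of pre-train -> fine-tune -> self-train for NER.
# All data, model sizes, and thresholds here are illustrative assumptions.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Toy BiLSTM token tagger (CRF layer omitted for brevity)."""
    def __init__(self, vocab_size, num_tags, emb_dim=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, x):                      # x: (batch, seq_len) token ids
        h, _ = self.lstm(self.emb(x))
        return self.out(h)                     # (batch, seq_len, num_tags)

def train(model, batches, epochs=3, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    # ignore_index=-100 is the usual convention for masking padding tokens
    loss_fn = nn.CrossEntropyLoss(ignore_index=-100)
    model.train()
    for _ in range(epochs):
        for x, y in batches:
            opt.zero_grad()
            logits = model(x)
            loss_fn(logits.view(-1, logits.size(-1)), y.view(-1)).backward()
            opt.step()

def pseudo_label(model, unlabeled, threshold=0.9):
    """Keep sentences whose least-confident token prediction clears threshold."""
    kept = []
    model.eval()
    with torch.no_grad():
        for x in unlabeled:                    # x: (1, seq_len), one sentence
            probs = torch.softmax(model(x), dim=-1)
            conf, tags = probs.max(dim=-1)
            if conf.min() >= threshold:        # sentence-level confidence filter
                kept.append((x, tags))
    return kept

# --- toy tensors standing in for the corpora named in the abstract ---
V, T = 100, 5                                  # vocab size, number of BIO tags
rand = lambda n: torch.randint(1, V, (n, 12))  # n sentences of length 12
big_corpus   = [(rand(8), torch.randint(0, T, (8, 12)))]   # e.g. MedMentions
small_target = [(rand(2), torch.randint(0, T, (2, 12)))]   # 250-2000 samples
unlabeled    = [rand(1) for _ in range(4)]                  # in-domain raw text

model = BiLSTMTagger(V, T)
train(model, big_corpus)                       # 1. pre-train on general corpus
train(model, small_target)                     # 2. fine-tune on target task
for _ in range(3):                             # 3. iterative self-training
    extra = pseudo_label(model, unlabeled)
    train(model, small_target + extra)
```

The design point worth noting is the confidence filter in pseudo_label: self-training only helps when the pseudo-labels folded back into training are mostly correct, so only sentences the model is already confident about are retained for the next round.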

Original language: English
Article number: e0246310
Journal: PLoS ONE
Volume: 16
Issue number: 2 February
DOI: 10.1371/journal.pone.0246310
State: Published - Feb 2021

Funding

Blair Christian (BC) at Oak Ridge National Laboratory received funding from the Department of Energy (energy.gov). This funding was provided through the Laboratory Directed Research and Development (LDRD) program of Oak Ridge National Laboratory, under LDRD project No. 9494. These funds were used to facilitate this study and to support the salaries of SG, OK, AS, and BC. This manuscript has been authored by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains, and the publisher, by accepting the article for publication, acknowledges that the United States Government retains, a nonexclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of the manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. This research also used resources of the Compute and Data Environment for Science (CADES) at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under the same contract. The funding offices from the Office of Science and DOE did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of all authors are articulated in the 'author contributions' section.

Funders (funder number):
CADES
DOE Public Access Plan
Data Environment for Science
United States Government
U.S. Department of Energy
Office of Science
Oak Ridge National Laboratory
Laboratory Directed Research and Development (DE-AC05-00OR22725, 9494)
