Chemical-Protein Relation Extraction with Pre-trained Prompt Tuning

Proc (IEEE Int Conf Healthc Inform). 2022 Jun:2022:608-609. doi: 10.1109/ichi54592.2022.00120. Epub 2022 Sep 8.

Abstract

Biomedical relation extraction plays a critical role in the construction of high-quality knowledge graphs and databases, which can further support many downstream applications. Pre-trained prompt tuning, as a new paradigm, has shown great potential in many natural language processing (NLP) tasks. By inserting a piece of text into the original input, prompting converts NLP tasks into masked language problems, which can be better addressed by pre-trained language models (PLMs). In this study, we applied pre-trained prompt tuning to chemical-protein relation extraction using the BioCreative VI CHEMPROT dataset. The experimental results showed that pre-trained prompt tuning outperformed the baseline approach in chemical-protein interaction classification. We conclude that prompt tuning can improve the efficiency of PLMs on chemical-protein relation extraction tasks.
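To make the prompting idea concrete, the sketch below shows how a relation-classification instance can be recast as a masked-language problem: a template with a [MASK] slot is appended to the input sentence, and a verbalizer maps candidate answer words back to CHEMPROT relation classes. The template wording and verbalizer words here are illustrative assumptions, not the paper's actual choices.

```python
# Illustrative sketch of prompt construction for chemical-protein
# relation extraction. The template and verbalizer are assumptions
# for demonstration, not the paper's reported configuration.

# Verbalizer: maps each CHEMPROT relation class to a single answer word
# that a masked language model could predict at the [MASK] position.
VERBALIZER = {
    "CPR:3": "upregulator",
    "CPR:4": "downregulator",
    "CPR:5": "agonist",
    "CPR:6": "antagonist",
    "CPR:9": "substrate",
}

def build_prompt(sentence: str, chemical: str, protein: str) -> str:
    """Append a template containing a [MASK] slot to the input sentence,
    turning relation classification into masked-token prediction."""
    return f"{sentence} {chemical} is the [MASK] of {protein}."

def decode_label(predicted_word: str) -> str:
    """Map the word a PLM predicts at [MASK] back to a CHEMPROT class;
    any unmatched word is treated as 'no relation'."""
    inverse = {word: label for label, word in VERBALIZER.items()}
    return inverse.get(predicted_word, "false")

prompt = build_prompt("Aspirin inhibits COX-1 activity.", "Aspirin", "COX-1")
# The prompt would then be fed to a PLM's masked-LM head, and the
# top-scoring verbalizer word at [MASK] determines the relation class.
```

In an actual pipeline, the masked-LM head of a biomedical PLM would score each verbalizer word at the [MASK] position, so classification reuses the pre-training objective rather than a newly initialized classifier head.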

Keywords: biomedical relation extraction; chemical-protein relation; pre-trained prompt tuning.