BERT Models for Biomedical Relation Extraction

Posted: 15 Oct 2024

Murali Mohana Krishna Dandu

Independent Researcher

Vanitha Sivasankaran Balasubramaniam

Independent Researcher

A Renuka

Independent Researcher

Om Goel

Independent Researcher

Dr. Punit Goel

Govt. (PG) College, Jaiharikhal, Pauri Garhwal (Uttarakhand)

Dr. Alok Gupta

Independent Researcher

Date Written: March 12, 2022

Abstract

Biomedical relation extraction is pivotal for advancing knowledge discovery and supporting decision-making in healthcare and research. Leveraging the capabilities of Bidirectional Encoder Representations from Transformers (BERT) has significantly enhanced the accuracy and efficiency of extracting complex relationships from the vast biomedical literature. This study explores the application of BERT-based models in identifying and classifying relationships among biomedical entities such as genes, proteins, diseases, and drugs. By fine-tuning pre-trained BERT models on specialized biomedical corpora, the research addresses the nuanced linguistic patterns and domain-specific terminologies inherent in biomedical texts. Comparative analyses demonstrate that BERT models outperform traditional machine learning approaches and earlier deep learning frameworks in tasks such as protein-protein interaction extraction, drug-disease association identification, and gene-disease relationship mapping. Additionally, the study investigates the integration of domain-adaptive pre-training and the incorporation of external knowledge bases to further enhance model performance. Challenges such as data scarcity, ambiguity in biomedical language, and the need for extensive computational resources are discussed, alongside strategies to mitigate these issues. The findings underscore the potential of BERT-based models to facilitate more accurate and scalable biomedical information extraction, thereby supporting the acceleration of biomedical research and the development of innovative healthcare solutions. Future directions include the exploration of more advanced transformer architectures, the expansion of annotated biomedical datasets, and the implementation of real-time relation extraction systems.
This research contributes to the growing body of knowledge on natural language processing in the biomedical domain and highlights the transformative impact of BERT models on extracting meaningful relationships from complex biomedical data.
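The fine-tuning approach described above typically frames relation extraction as sequence classification over a sentence in which the two candidate entities have been explicitly marked. As a minimal sketch (the marker tokens `[E1]`/`[E2]` are a common convention, not a detail taken from this paper), the preprocessing step might look like:

```python
# Minimal sketch: preparing one sentence for BERT-style relation
# classification by wrapping the two candidate entities in marker
# tokens. The marked string would then be tokenized and passed to a
# fine-tuned sequence-classification head that predicts the relation
# label (e.g. drug-treats-disease). Marker tokens are illustrative.

def mark_entities(text: str, span1: tuple, span2: tuple) -> str:
    """Insert [E1]..[/E1] and [E2]..[/E2] around two (start, end) char spans."""
    # Sort spans so insertion offsets are handled left to right,
    # while keeping E1/E2 tied to the argument order.
    (s1, e1), (s2, e2) = sorted([span1, span2])
    first_tags, second_tags = ("[E1]", "[/E1]"), ("[E2]", "[/E2]")
    if (s1, e1) != span1:
        first_tags, second_tags = second_tags, first_tags
    return (
        text[:s1] + first_tags[0] + text[s1:e1] + first_tags[1]
        + text[e1:s2] + second_tags[0] + text[s2:e2] + second_tags[1]
        + text[e2:]
    )

sentence = "Aspirin reduces the risk of myocardial infarction."
# "Aspirin" spans chars 0-7; "myocardial infarction" spans chars 28-49.
print(mark_entities(sentence, (0, 7), (28, 49)))
# -> [E1]Aspirin[/E1] reduces the risk of [E2]myocardial infarction[/E2].
```

In a full pipeline, the marked sentence would be tokenized with a biomedical BERT tokenizer (the marker tokens added to its vocabulary) and the model trained with a cross-entropy loss over the relation labels.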

Keywords: BERT Models, Biomedical Relation Extraction, Natural Language Processing, Transformer Architectures, Gene-Disease Relationships, Protein-Protein Interactions, Drug-Disease Associations, Deep Learning in Biomedicine, Domain-Specific Fine-Tuning, Knowledge Base Integration, Biomedical Text Mining, Machine Learning in Healthcare, Relation Classification, Information Extraction

Suggested Citation

Dandu, Murali Mohana Krishna and Balasubramaniam, Vanitha Sivasankaran and Renuka, A and Goel, Om and Goel, Punit and Gupta, Alok, "BERT Models for Biomedical Relation Extraction" (March 12, 2022). Available at SSRN: https://ssrn.com/abstract=4985957


