A BERT-based one-pass multi-task model for clinical temporal relation extraction

  • Chen Lin
  • Timothy Miller
  • Dmitriy Dligach
  • Farig Sadeque
  • Steven Bethard
  • Guergana Savova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

23 Scopus citations

Abstract

Recently, BERT has achieved state-of-the-art performance in temporal relation extraction from clinical Electronic Medical Records text. However, the current approach is inefficient, as it requires multiple passes through each input sequence. We extend a recently proposed one-pass model for relation classification to a one-pass model for relation extraction. We augment this framework by introducing global embeddings to help with long-distance relation inference, and by applying multi-task learning to increase model performance and generalizability.
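The efficiency gain described in the abstract comes from encoding each sequence once and scoring all entity pairs from that single encoding, rather than re-encoding the sequence per pair. The sketch below illustrates that idea only; it is not the paper's implementation. The entity spans, the mean-pooling, the pair scorer, and the relation labels are all illustrative assumptions, and random vectors stand in for BERT's contextual embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for one encoder pass: one contextual embedding per token.
# (Shapes are illustrative; the paper uses BERT as the encoder.)
seq_len, hidden = 12, 16
token_emb = rng.normal(size=(seq_len, hidden))

# Hypothetical entity mentions as [start, end) token spans.
entities = [(1, 2), (4, 6), (9, 10)]

def span_vector(emb, span):
    """Mean-pool contextual embeddings over an entity span."""
    s, e = span
    return emb[s:e].mean(axis=0)

# Illustrative relation inventory (not the paper's label set).
n_labels = 4
W = rng.normal(size=(2 * hidden, n_labels))  # toy pair scorer

# One pass: encode once, then score every ordered entity pair
# from the cached token embeddings.
pair_logits = {}
for i, a in enumerate(entities):
    for j, b in enumerate(entities):
        if i == j:
            continue
        feats = np.concatenate([span_vector(token_emb, a),
                                span_vector(token_emb, b)])
        pair_logits[(i, j)] = feats @ W

print(len(pair_logits))  # 6 ordered pairs from 3 entities
```

A multi-pass formulation would instead run the encoder once per candidate pair (with pair-specific markers), so the one-pass variant saves a factor proportional to the number of pairs per sequence.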

Original language: English (US)
Title of host publication: BioNLP 2020 - 19th SIGBioMed Workshop on Biomedical Language Processing, Proceedings of the Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 70-75
Number of pages: 6
ISBN (Electronic): 9781952148095
State: Published - 2020
Event: 19th SIGBioMed Workshop on Biomedical Language Processing, BioNLP 2020 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020 - Virtual, Online, United States
Duration: Jul 9 2020 → …

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics

Conference

Conference: 19th SIGBioMed Workshop on Biomedical Language Processing, BioNLP 2020 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020
Country/Territory: United States
City: Virtual, Online
Period: 7/9/20 → …

ASJC Scopus subject areas

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics