ATM-TCR: TCR-Epitope Binding Affinity Prediction Using a Multi-Head Self-Attention Model

Michael Cai, Seojin Bang, Pengfei Zhang, Heewook Lee

Research output: Contribution to journal › Article › peer-review


Abstract

TCR-epitope pair binding is the key component of T cell regulation. The ability to predict whether a given pair binds is fundamental to understanding the underlying biology of the binding mechanism as well as to developing T cell-mediated immunotherapy approaches. The advent of large-scale public databases containing TCR-epitope binding pairs has enabled the recent development of computational prediction methods for TCR-epitope binding. However, the number of epitopes reported along with binding TCRs is far too small, resulting in poor out-of-sample performance on unseen epitopes. To address this issue, we present our model, ATM-TCR, which uses a multi-head self-attention mechanism to capture biological contextual information and improve generalization performance. Additionally, we present a novel application of the attention map from our model to improve out-of-sample performance, demonstrated on recent SARS-CoV-2 data.
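As a rough illustration of the kind of architecture the abstract describes, the sketch below encodes an integer-encoded TCR sequence and epitope sequence with multi-head self-attention and classifies the pair as binding or non-binding, returning the attention weights so they can be inspected afterward. This is a minimal, hypothetical PyTorch sketch: the module names, dimensions, pooling, and classification head are illustrative assumptions and are not taken from the paper's implementation.

```python
import torch
import torch.nn as nn

class SelfAttentionEncoder(nn.Module):
    """Embed an amino-acid sequence and contextualize it with multi-head self-attention."""
    def __init__(self, vocab_size=21, d_model=64, n_heads=4, max_len=30):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positional encoding
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer-encoded amino acids, 0 = padding
        x = self.embed(tokens) + self.pos[:, : tokens.size(1)]
        pad_mask = tokens.eq(0)  # mask padded positions as attention keys
        out, attn_weights = self.attn(x, x, x, key_padding_mask=pad_mask)
        # naive mean pooling over positions, kept simple for the sketch
        return out.mean(dim=1), attn_weights

class PairBindingClassifier(nn.Module):
    """Score a TCR-epitope pair as binding / non-binding from the two pooled encodings."""
    def __init__(self, d_model=64):
        super().__init__()
        self.tcr_enc = SelfAttentionEncoder(d_model=d_model)
        self.epi_enc = SelfAttentionEncoder(d_model=d_model)
        self.head = nn.Sequential(nn.Linear(2 * d_model, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, tcr, epitope):
        t, t_attn = self.tcr_enc(tcr)
        e, e_attn = self.epi_enc(epitope)
        logit = self.head(torch.cat([t, e], dim=-1)).squeeze(-1)
        return logit, (t_attn, e_attn)  # attention maps available for post hoc analysis

# Toy usage: a batch of 2 pairs with random integer-encoded sequences.
model = PairBindingClassifier()
tcr = torch.randint(1, 21, (2, 20))   # hypothetical CDR3 sequences of length 20
epi = torch.randint(1, 21, (2, 9))    # hypothetical 9-mer epitopes
logit, attn_maps = model(tcr, epi)
prob = torch.sigmoid(logit)           # predicted binding probability per pair
```

The returned attention weights are one place where an "attention map" analysis, like the one mentioned in the abstract, could plug in, though the paper's actual procedure may differ.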

Original language: English (US)
Article number: 893247
Journal: Frontiers in Immunology
Volume: 13
DOIs
State: Published - Jul 6 2022

Keywords

  • TCR
  • adaptive immunotherapy
  • antigen
  • binding affinity
  • epitope
  • machine learning
  • self-attention model

ASJC Scopus subject areas

  • Immunology and Allergy
  • Immunology
