TY - JOUR
T1 - Transductive Linear Probing: A Novel Framework for Few-Shot Node Classification
T2 - 1st Learning on Graphs Conference, LOG 2022
AU - Tan, Zhen
AU - Wang, Song
AU - Ding, Kaize
AU - Li, Jundong
AU - Liu, Huan
N1 - Funding Information: This work is supported by the National Science Foundation under grants IIS-2006844, IIS-2144209, IIS-2223769, IIS-2229461, CNS-2154962, and BCS-2228534, the Army Research Office (ARO) W911NF2110030, the Office of Naval Research N00014-21-1-4002, the JP Morgan Chase Faculty Research Award, the Cisco Faculty Research Award, and the Jefferson Lab subcontract JSA-22-D0311. Publisher Copyright: © 2022 Proceedings of Machine Learning Research. All rights reserved.
PY - 2022
Y1 - 2022
N2 - Few-shot node classification aims to provide accurate predictions for nodes from novel classes with only a few representative labeled nodes. This problem has drawn tremendous attention for its relevance to prevalent real-world applications, such as product categorization for newly added commodity categories on an e-commerce platform with scarce records, or diagnosis of rare diseases on a patient similarity graph. To tackle such challenging label scarcity issues in the non-Euclidean graph domain, meta-learning has become a successful and predominant paradigm. More recently, inspired by the development of graph self-supervised learning, transferring pretrained node embeddings for few-shot node classification could be a promising alternative to meta-learning but remains underexplored. In this work, we empirically demonstrate the potential of an alternative framework, Transductive Linear Probing, that transfers pretrained node embeddings learned from graph contrastive learning methods. We further extend the setting of few-shot node classification from the standard fully supervised setting to a more realistic self-supervised setting, where meta-learning methods cannot be easily deployed due to the shortage of supervision from training classes. Surprisingly, even without any ground-truth labels, transductive linear probing with self-supervised graph contrastive pretraining can outperform state-of-the-art fully supervised meta-learning-based methods under the same protocol. We hope this work can shed new light on few-shot node classification problems and foster future research on learning from scarcely labeled instances on graphs.
AB - Few-shot node classification aims to provide accurate predictions for nodes from novel classes with only a few representative labeled nodes. This problem has drawn tremendous attention for its relevance to prevalent real-world applications, such as product categorization for newly added commodity categories on an e-commerce platform with scarce records, or diagnosis of rare diseases on a patient similarity graph. To tackle such challenging label scarcity issues in the non-Euclidean graph domain, meta-learning has become a successful and predominant paradigm. More recently, inspired by the development of graph self-supervised learning, transferring pretrained node embeddings for few-shot node classification could be a promising alternative to meta-learning but remains underexplored. In this work, we empirically demonstrate the potential of an alternative framework, Transductive Linear Probing, that transfers pretrained node embeddings learned from graph contrastive learning methods. We further extend the setting of few-shot node classification from the standard fully supervised setting to a more realistic self-supervised setting, where meta-learning methods cannot be easily deployed due to the shortage of supervision from training classes. Surprisingly, even without any ground-truth labels, transductive linear probing with self-supervised graph contrastive pretraining can outperform state-of-the-art fully supervised meta-learning-based methods under the same protocol. We hope this work can shed new light on few-shot node classification problems and foster future research on learning from scarcely labeled instances on graphs.
UR - http://www.scopus.com/inward/record.url?scp=85164534973&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85164534973&partnerID=8YFLogxK
M3 - Conference article
SN - 2640-3498
VL - 198
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
Y2 - 9 December 2022 through 12 December 2022
ER -