TY - GEN
T1 - Be more with less: Hypergraph attention networks for inductive text classification
T2 - 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020
AU - Ding, Kaize
AU - Wang, Jianling
AU - Li, Jundong
AU - Li, Dingcheng
AU - Liu, Huan
N1 - Funding Information: This material is in part supported by the National Science Foundation (NSF) grant 1614576. Publisher Copyright: © 2020 Association for Computational Linguistics
PY - 2020
Y1 - 2020
N2 - Text classification is a critical research topic with broad applications in natural language processing. Recently, graph neural networks (GNNs) have received increasing attention in the research community and demonstrated their promising results on this canonical task. Despite the success, their performance could be largely jeopardized in practice since they are: (1) unable to capture high-order interaction between words; (2) inefficient to handle large datasets and new documents. To address those issues, in this paper, we propose a principled model - hypergraph attention networks (HyperGAT), which can obtain more expressive power with less computational consumption for text representation learning. Extensive experiments on various benchmark datasets demonstrate the efficacy of the proposed approach on the text classification task.
UR - http://www.scopus.com/inward/record.url?scp=85115716964&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85115716964&partnerID=8YFLogxK
M3 - Conference contribution
T3 - EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
SP - 4927
EP - 4936
BT - EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
PB - Association for Computational Linguistics (ACL)
Y2 - 16 November 2020 through 20 November 2020
ER -