What's the relationship between deep language models (e.g., BERT, GPT-2, GPT-3) and knowledge graphs? Can we use pre-trained deep language models to construct knowledge graphs? We find that we can. The generated knowledge graphs not only cover knowledge already present in existing knowledge graphs, such as Wikidata, but also feature open factual knowledge that is new.
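To make the idea concrete, here is a toy sketch of open triple extraction: given a sentence and its detected entity spans, each ordered entity pair yields a candidate (head, relation, tail) triple. This is a deliberately simplified illustration, not the paper's actual method, which scores candidate relation paths using the attention weights of a pre-trained language model; the example sentence and spans below are hypothetical.

```python
# Toy sketch of open triple extraction: for each pair of detected
# entities in a sentence, treat the tokens between them as a candidate
# relation phrase, yielding (head, relation, tail) triples.
# The real approach additionally scores these candidates with a
# pre-trained LM's attention weights; this sketch uses surface order only.

def extract_triples(tokens, entity_spans):
    """tokens: list of words; entity_spans: list of (start, end) index pairs."""
    triples = []
    for i, (h_start, h_end) in enumerate(entity_spans):
        for t_start, t_end in entity_spans[i + 1:]:
            relation = tokens[h_end:t_start]
            if relation:  # skip adjacent entities with no relation phrase
                triples.append((
                    " ".join(tokens[h_start:h_end]),
                    " ".join(relation),
                    " ".join(tokens[t_start:t_end]),
                ))
    return triples

sentence = "Dylan is a songwriter".split()
print(extract_triples(sentence, [(0, 1), (3, 4)]))
# [('Dylan', 'is a', 'songwriter')]
```

The candidate triples can then be mapped to an existing schema (e.g., Wikidata relations) or kept as open, unmapped facts.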
I was a research scientist at Amazon AI and IBM Research-Almaden. I received my Ph.D. from Peking University, advised by Dr. Ming Zhang, and was a visiting Ph.D. student at UIUC, advised by Dr. Jiawei Han.
- Oct, 2020: “Language Models are Open Knowledge Graphs” is on arXiv [paper]. Our code and knowledge graphs will be made publicly available. Stay tuned!
- Sep, 2020: “PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction” is accepted by COLING 2020 [paper]
- Jul, 2020: I will serve as PC for ICLR 2021.
- Feb, 2020: I will serve as PC for ACL 2020.
- Feb, 2020: “GluonCV and GluonNLP: deep learning in computer vision and natural language processing” is accepted by JMLR [paper] [GluonNLP code] [GluonCV code].
My research interests span the areas of NLP, ML Systems, and Security. The goal of my research is to enable real-world applications with secure artificial general intelligence. To achieve this goal, I have been working on the intersection of deep language understanding, AI systems, and computer security and privacy.
Selected Publications [Full List]
Received more than 4.4k blog views and more than 320 likes and retweets on Twitter. Experimental results on PTB, WikiText-2, and WikiText-103 show that the proposed method achieves perplexities between 20.42 and 34.11 on all problems, i.e., on average an improvement of 12.0 perplexity units compared to state-of-the-art LSTMs.
Crowd-in-the-loop: A hybrid approach for annotating semantic roles
Chenguang Wang, Alan Akbik, Laura Chiticariu, Yunyao Li, Fei Xia, and Anbang Xu.
In Proc. 2017 Conf. on Empirical Methods on Natural Language Processing (EMNLP 2017).
[paper] [data] [slides]
Our experimental evaluation shows that the proposed approach reduces the workload for experts by over two-thirds, and thus significantly reduces the cost of producing SRL annotation with little loss in quality.
Text classification with heterogeneous information network kernels
Chenguang Wang, Yangqiu Song, Haoran Li, Ming Zhang and Jiawei Han.
In Proc. 2016 AAAI Conf. on Artificial Intelligence (AAAI 2016).
[paper] [slides] [code] [data]
This paper presents a novel text-as-network classification framework, which introduces a structured and typed heterogeneous information network (HIN) representation of texts, and a meta-path-based approach to link texts.
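A minimal sketch of the meta-path idea: documents and their typed entities form a heterogeneous network, and two documents are linked through the meta-path Document → Entity → Document whenever they mention the same typed entity. The document-entity annotations below are hypothetical examples, and this counts raw path instances rather than the paper's kernel computation.

```python
# Toy sketch of meta-path based document linking in a heterogeneous
# information network (HIN): documents and typed entities are nodes,
# and each shared (entity, type) pair between two documents is one
# instance of the meta-path Document -> Entity -> Document.

from collections import defaultdict

def meta_path_counts(doc_entities):
    """doc_entities: dict mapping doc id -> set of (entity, type) pairs.
    Returns counts of Doc-Entity-Doc path instances between doc pairs."""
    counts = defaultdict(int)
    docs = sorted(doc_entities)
    for i, d1 in enumerate(docs):
        for d2 in docs[i + 1:]:
            counts[(d1, d2)] = len(doc_entities[d1] & doc_entities[d2])
    return dict(counts)

docs = {
    "doc1": {("Obama", "Person"), ("Chicago", "Location")},
    "doc2": {("Obama", "Person"), ("Hawaii", "Location")},
}
print(meta_path_counts(docs))  # {('doc1', 'doc2'): 1}
```

Such path counts (possibly restricted to particular entity types) can then feed a similarity kernel for text classification.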
Incorporating world knowledge to document clustering via heterogeneous information networks
Chenguang Wang, Yangqiu Song, Ahmed El-Kishky, Dan Roth, Ming Zhang, and Jiawei Han.
In Proc. 2015 ACM SIGKDD Int. Conf. on Knowledge Discovery and Data Mining (KDD 2015).
[paper] [slides] [video] [code] [data]
We provide three ways to adapt the world knowledge to specific domains by resolving the ambiguity of entities and their types, and represent the data, together with the world knowledge, as a heterogeneous information network.