Abstract: Reasoning, the ability to logically draw conclusions from existing knowledge, has long been pursued as a goal of artificial intelligence. Although numerous learning algorithms have been developed for reasoning, most of them are limited to the domain they are trained on. By contrast, humans often derive high-level rules or principles from experience and apply them to new domains — an ability referred to as inductive generalization. In this talk, we present a series of works that learn inductive representations for reasoning over knowledge graphs. First, we introduce Neural Bellman-Ford Networks (NBFNet), a model that captures paths between entities and can generalize to graphs with new entities. Then we discuss Graph Neural Network Query Executor (GNN-QE), an extension of NBFNet that answers multi-hop logical queries and generalizes well on our inductive benchmark. Finally, by learning inductive representations for both entities and relations, we demonstrate that a model can generalize to any graph with arbitrary entity and relation vocabularies, paving the way for foundation models for knowledge graph reasoning.
Bio: Zhaocheng Zhu is a final-year Ph.D. candidate advised by Prof. Jian Tang at Mila – Quebec AI Institute, University of Montreal. His research interests include reasoning, knowledge graphs, and large language models. His works, among the first to study inductive generalization across graph structures, have led to a paradigm shift away from the traditional knowledge graph embedding methods that have been used for years. He gave a tutorial on knowledge graph reasoning at AAAI 2022. He is also an active developer of machine learning systems, having led the development of two open-source libraries: GraphVite for large-scale embedding training and TorchDrug for drug discovery research.