LAMA-WeST Lab

Artificial Intelligence
Natural Language Processing
Semantic Web

Improving knowledge graph embeddings by learning vector representations of logical rules

Student: Félix Martel

Supervisor: Amal Zouaq

Knowledge graph embeddings provide an efficient and convenient way to handle large knowledge graphs and to obtain dense, meaningful representations of their entities. Yet most of these embedding models are trained using only the entities contained in the graph, and thus do not take advantage of the underlying ontology. In this work, I try to incorporate logical axioms into embedding models. One of the goals is to find a model that can compute vector representations for both entities and logical rules, such that the geometric properties of the embeddings reflect the logical relationships between the elements (an illustrative sketch is given below).
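
To make the idea of "geometry reflecting logic" more concrete, the minimal sketch below assumes a TransE-style translational model, which is not necessarily the model studied in this project. In such a model, a composition rule like r1(x, y) ∧ r2(y, z) → r3(x, z) corresponds to the geometric constraint r1 + r2 ≈ r3, so the rule itself can be scored as a vector expression. All entity names, relation names, and dimensions here are illustrative assumptions.

```python
# Hypothetical sketch: scoring triples and a composition rule in a
# TransE-style embedding space. The embeddings are random placeholders;
# in practice they would be learned from the knowledge graph.
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Illustrative entity and relation embeddings.
entities = {e: rng.normal(size=dim) for e in ["paris", "france", "europe"]}
relations = {r: rng.normal(size=dim) for r in ["capital_of", "located_in", "city_in"]}

def triple_score(h, r, t):
    """TransE-style score: a smaller distance ||h + r - t|| means a more plausible triple."""
    return np.linalg.norm(entities[h] + relations[r] - entities[t])

def rule_score(body, head):
    """Geometric plausibility of a composition rule body[0] ∧ body[1] → head:
    distance between the sum of the body relation vectors and the head relation vector."""
    return np.linalg.norm(relations[body[0]] + relations[body[1]] - relations[head])

# Score a fact and a rule with the same geometric machinery.
print(triple_score("paris", "capital_of", "france"))
print(rule_score(("capital_of", "located_in"), "city_in"))
```

The point of the sketch is that, under such a model, both facts and rules live in the same vector space, so axioms from the ontology could in principle be used as additional training signals alongside the graph's triples.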