Fuzzy Constraints for Knowledge Graph Embeddings

Tracking #: 3134-4348

This paper is currently under review
Michael Weyns
Pieter Bonte
Filip De Turck
Femke Ongenae

Responsible editor: 
Guest Editors NeSy 2022

Submission type: 
Full Paper
Abstract: 
Knowledge graph embeddings can be trained to infer which missing facts are likely to be true. To do so, false training examples must be derived from the available set of positive facts, so that the embedding models can learn to recognise the boundary between fact and fiction. Various negative sampling strategies have been proposed for this purpose, some of which exploit axiomatic knowledge claims to minimise the number of nonsensical negative samples being generated. By putting constraints on the construction of each candidate sample, these techniques aim to maximise the number of true negatives produced by the procedure. Unfortunately, such strategies rely exclusively on binary interpretations of constraint-based reasoning and have so far failed to incorporate literal-valued entities into the negative sampling procedure. To alleviate these shortcomings, we propose a negative sampling strategy based on a combination of fuzzy set theory and strict axiomatic semantics, which allows literal values to be taken into account when determining domain or range membership. When evaluated on the benchmark datasets AIFB and MUTAG, these improvements offered significant performance gains across multiple metrics relative to state-of-the-art negative sampling techniques.
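The core idea described in the abstract can be sketched as follows. This is a minimal illustration of fuzzy-weighted, constraint-aware negative sampling, not the authors' actual implementation: all entities, relations, and membership scores below are hypothetical, and the fuzzy class-membership values stand in for scores that could be derived from literal-valued attributes.

```python
import random

# Hypothetical fuzzy class-membership scores in [0, 1]: how strongly each
# entity belongs to a class (in the paper these would be informed by
# literal-valued attributes rather than hard-coded).
CLASS_MEMBERSHIP = {
    ("alice", "Person"): 1.0,
    ("bob", "Person"): 1.0,
    ("acme", "Company"): 0.9,
    ("globex", "Company"): 0.8,
}

# Strict axiomatic domain/range constraints per relation.
DOMAIN = {"worksFor": "Person"}
RANGE = {"worksFor": "Company"}


def membership(entity, cls):
    """Fuzzy degree to which `entity` belongs to class `cls`."""
    return CLASS_MEMBERSHIP.get((entity, cls), 0.0)


def corrupt_tail(triple, candidates, rng=random):
    """Build a negative sample by replacing the tail of (h, r, t),
    weighting candidate replacements by their fuzzy membership in the
    range class of r, so nonsensical corruptions become unlikely."""
    h, r, t = triple
    range_cls = RANGE[r]
    pool = [e for e in candidates if e != t]
    weights = [membership(e, range_cls) for e in pool]
    if sum(weights) == 0:
        # No candidate satisfies the range constraint at all:
        # fall back to uniform corruption.
        return (h, r, rng.choice(pool))
    return (h, r, rng.choices(pool, weights=weights, k=1)[0])


negative = corrupt_tail(("alice", "worksFor", "acme"),
                        ["alice", "bob", "acme", "globex"])
```

Here only "globex" has a nonzero membership in the range class "Company" among the remaining candidates, so the sampler produces ("alice", "worksFor", "globex") rather than a nonsensical corruption such as ("alice", "worksFor", "bob").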