From GPT to Mistral: Cross-Domain Ontology Learning with NeOn-GPT

Tracking #: 3859-5073

This paper is currently under review
Authors: 
Nadeen Fathallah
Arunav Das
Stefano De Giorgis
Andrea Poltronieri
Peter Haase
Liubov Kovriguina
Elena Simperl
Albert Meroño-Peñuela
Steffen Staab
Alsayed Algergawy

Responsible editor: 
Marta Sabou

Submission type: 
Full Paper
Abstract: 
We extend our previous work on NeOn-GPT, an LLM-powered ontology learning pipeline grounded in the NeOn methodology, by introducing methodological enhancements and broadening its evaluation across multiple domains and language models. We apply the pipeline to four diverse domains; for each domain, ontologies are generated using a proprietary model (GPT-4o) and an open-source model (Mistral). Evaluation is conducted against gold-standard ontologies using structural, lexical, and semantic metrics. Results demonstrate that LLMs can produce ontologies with high relational expressivity and partial conceptual alignment, though performance varies by domain and model.
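To illustrate the kind of comparison the abstract describes, the sketch below shows one minimal way to contrast a generated ontology with a gold standard: simple structural counts (classes, object and datatype properties) and a lexical Jaccard overlap over class names. This is not the authors' evaluation code; the metric choices, helper names, and file paths (`generated_ontology.ttl`, `gold_standard.ttl`) are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): compare a
# generated ontology with a gold-standard ontology using basic structural
# counts and a lexical overlap of class names.
from rdflib import Graph, RDF, OWL


def class_names(graph: Graph) -> set[str]:
    """Collect lower-cased local names of all owl:Class subjects."""
    names = set()
    for cls in graph.subjects(RDF.type, OWL.Class):
        local = str(cls).rsplit("#", 1)[-1].rsplit("/", 1)[-1]
        names.add(local.lower())
    return names


def structural_counts(graph: Graph) -> dict[str, int]:
    """Count basic OWL constructs used for a structural comparison."""
    return {
        "classes": len(set(graph.subjects(RDF.type, OWL.Class))),
        "object_properties": len(set(graph.subjects(RDF.type, OWL.ObjectProperty))),
        "datatype_properties": len(set(graph.subjects(RDF.type, OWL.DatatypeProperty))),
    }


def lexical_jaccard(generated: Graph, gold: Graph) -> float:
    """Jaccard similarity between the class-name sets of two ontologies."""
    a, b = class_names(generated), class_names(gold)
    return len(a & b) / len(a | b) if a | b else 0.0


if __name__ == "__main__":
    gen, gold = Graph(), Graph()
    gen.parse("generated_ontology.ttl", format="turtle")  # placeholder path
    gold.parse("gold_standard.ttl", format="turtle")      # placeholder path
    print("generated:", structural_counts(gen))
    print("gold:     ", structural_counts(gold))
    print(f"lexical class overlap (Jaccard): {lexical_jaccard(gen, gold):.2f}")
```

Semantic metrics (e.g., embedding-based concept alignment) would require additional tooling beyond this structural and lexical sketch.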
Tags: 
Under Review