Concept Induction in Description Logics Using Information-Theoretic Heuristics


Nicola Fanizzi
Copyright: © 2011 |Pages: 22
DOI: 10.4018/jswis.2011040102

Abstract

This paper presents an approach to ontology construction pursued through the induction of concept descriptions expressed in Description Logics. The author surveys the theoretical foundations of the standard representations for formal ontologies in the Semantic Web. After stating the learning problem in this context, a FOIL-like algorithm is presented that can be applied to learn DL concept descriptions. The algorithm searches a space of candidate concept definitions by means of refinement operators, guided by heuristics based on the available examples. The author discusses related theoretical aspects of learning under the inherent incompleteness of the semantics of this representation. The experimental evaluation of the system DL-Foil, which implements the learning algorithm, was carried out in two series of sessions on real ontologies from standard repositories, covering different domains and expressed in diverse description logics.

1 Introduction

Formal ontologies are likely to play a key role in next-generation information systems, as legacy data moves to (linked) open data whose semantics is intended to be formalized and shared across the Web (Staab & Studer, 2009). One of the bottlenecks of this process is the construction (and evolution) of the ontologies, since it involves different actors: domain experts contribute their knowledge, but it must then be formalized by knowledge engineers so that it can be processed by machines.

As the gap between these roles makes the process slow and burdensome, the problem may be tackled by resorting to machine learning techniques. Ontology learning (Cimiano, Mädche, Staab, & Völker, 2009) aims to provide solutions to the problem of (semi-)automated ontology construction. Cast as an information-extraction subtask, ontology learning has focused on learning from text corpora (Buitelaar & Cimiano, 2008). The main drawback of this approach is that the elicited concepts and relations are represented in languages of limited expressiveness. A different approach is based on relational learning (see De Raedt, 2008, for a recent survey), which requires limited effort from domain experts (labeling individual resources as instances or non-instances of the target concepts) and which supports the construction of concepts even in very expressive languages (Lehmann, 2010).

If the concept learning problem is tackled as a search through a space of candidate descriptions in the reference representation, guided by exemplars of the target concepts, then the same algorithms can be adapted to solve ontology evolution problems as well. Indeed, while the semantics of change operations for this task has normally been considered from the logical and deductive point of view of automated reasoning, a relevant part of the information lying in the data that populates ontological knowledge bases is generally overlooked or plays a secondary role.
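The search just described — repeatedly specializing candidate definitions via a refinement operator, scored by an example-guided heuristic — can be sketched as a FOIL-style separate-and-conquer loop. The sketch below is a hypothetical illustration over a toy attribute-based representation, not the paper's DL-Foil system; the function names and the simplified gain measure are assumptions made for this example.

```python
import math

def refine(concept, attributes):
    """Downward refinement: specialize by adding one attribute constraint."""
    for attr, value in attributes:
        if attr not in concept:
            yield {**concept, attr: value}

def covers(concept, individual):
    return all(individual.get(a) == v for a, v in concept.items())

def gain(concept, pos, neg):
    """Simplified information-theoretic heuristic: purity of the covered
    examples, weighted by positive coverage (a stand-in for FOIL's gain)."""
    p = sum(covers(concept, x) for x in pos)
    n = sum(covers(concept, x) for x in neg)
    if p == 0:
        return -math.inf
    return p * math.log2(p / (p + n))

def learn_concept(pos, neg, attributes, max_steps=10):
    """Separate-and-conquer: each outer iteration grows one partial
    definition; their disjunction covers all the positive examples."""
    definitions, uncovered = [], list(pos)
    while uncovered:
        concept = {}  # the top concept: covers every individual
        for _ in range(max_steps):
            if not any(covers(concept, x) for x in neg):
                break  # consistent with the negatives: stop specializing
            concept = max(refine(concept, attributes),
                          key=lambda c: gain(c, uncovered, neg))
        still_uncovered = [x for x in uncovered if not covers(concept, x)]
        if len(still_uncovered) == len(uncovered):
            break  # no progress on the remaining positives: give up
        definitions.append(concept)
        uncovered = still_uncovered
    return definitions
```

For instance, given positives {color: red, shape: circle} and {color: red, shape: square} against the negative {color: blue, shape: circle}, the loop induces the single partial definition {color: red}, which covers both positives and no negative.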

Description Logics (DLs) are a family of languages supporting the standard ontology languages designed for knowledge bases in the context of the Semantic Web. These logics constitute specific fragments of First Order Logic (FOL) that differ from the standard clausal languages employed in relational learning: they have a different syntax and, more importantly, a very different semantics, notably the open-world assumption (Borgida, 1996; Baader, Calvanese, McGuinness, Nardi, & Patel-Schneider, 2007). This motivates the growing interest in investigating inductive methods for these formalisms.
