Theory of Ontology and Meta-Modeling and the Standard: An Enabler for Semantic Interoperability

Keqing He (Wuhan University, China), Chong Wang (Wuhan University, China), Yangfan He (Wuhan University, China), Yutao Ma (Wuhan University, China) and Peng Liang (Wuhan University, China)
DOI: 10.4018/978-1-60566-731-7.ch007


With the continuous development and rapid progress of information technology, the complexity and scale of information systems keep growing, which turns our research focus to how to ensure effective information exchange and efficient interconnection between the parts of an information system. However, because different modeling paradigms, languages, and platforms introduce syntactic and semantic diversity into existing information resources, a new approach to information resource management is needed to promote deep sharing of those resources and to enable rapid integration based on them. To realize semantic interoperation between diverse information systems and resources, the authors combine the meta-modeling methodology of software engineering with the notion of ontology from philosophy into a novel methodology named the Theory of Ontology & Meta-modeling. Based on this methodology, the authors have contributed since 2003 to the international standard project ISO/IEC 19763-3: Metamodel for ontology registration, which was officially published as an international standard in December 2007. Furthermore, the authors developed a Management and Service platform for Semantic Interoperability on the Manufacturing Informationalization Software Component Repository (SCR). It supports ontology-based classification, registration, and management of software component attributes using the ISO/IEC 19763-3 standard, and implements semantic (ontology-based) query and retrieval of software components. Based on the techniques mentioned above, the platform facilitates the management of semantic interoperability and provides a reliable infrastructure for reusing and sharing heterogeneous software component resources.
Chapter Preview

1 Introduction

The requirements elicitation process is one of the most challenging parts of software development. In traditional software development methods, end users or stakeholders predefine their requirements and send them to the development team, which performs analysis and negotiation to produce a requirements specification. Traditional software development struggles to deal with requirements that change after this careful analysis and negotiation. This problem is tackled well by XP, one of the agile software development methodologies.

Extreme Programming (XP) is a conceptual framework of practices and principles for developing software faster, incrementally, and in a way that produces satisfied customers. It comprises twelve practices and four principles (Beck, 2000), which have made XP successful and well known among the agile software development methods. According to Boehm (1981), the cost of change grows exponentially as a project progresses through its lifecycle. The relative repair cost is 200 times greater when a defect is found in the maintenance phase than when it is caught in the requirements phase (Faulk, 1996). XP contains the cost of change through iterative development and refactoring.

Meanwhile, CMM, CMMI, and software process improvement in general have gained a lot of attention during the last decade. Due to increasing competition in the software market, faster delivery, high-quality products, and customer satisfaction are major concerns for software organisations. A quality process can have a positive impact on services, cost, on-time delivery, development technology, quality people, and quality of products (Zahran, 1998).

Getting the requirements on story cards right remains a universal problem, just as it is in traditional methodologies. Story card errors can be costly in terms of lost time, lost revenue, loss of reputation, and even survival (Beecham, et al., 2005). A critical aspect of the requirements process is the selection of an appropriate requirements set from the multitude of competing and conflicting expectations elicited from the various project stakeholders or from an on-site customer (Wiegers, 1997).

Methods such as CMM, which address process quality, measurement, and improvement, tend to cover the area of requirements engineering poorly. CMM does not specify how the quality of the requirements engineering process should be assured, or what activities must be present for the requirements engineering process to reach a certain maturity level. It is therefore often difficult to assess the maturity of the requirements engineering process for a given project, and difficult to know what is not being addressed or what could be done to improve the process.

Key Terms in this Chapter

Reference Ontology: ontology that is usable and sharable by a community of interest.

Semantic Interoperability: the ability to exchange and use information between two or more entities.

Universe of Domain: all those things of interest that are concrete or abstract and that have been, are, or ever might be.

Ontology: description of a universe of domain in a language that a computer can process.

Local Ontology: ontology that is specialized for defined applications and based on at least one reference ontology.

Metamodel Framework for Interoperability (MFI): framework for registering artifacts that are based on meta-models and models.

Meta-Modeling: a methodology of how to extract common information from models and create a meta-model.
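To make the distinction between a model and a meta-model concrete, the following is a minimal sketch in Python. The dictionary layout, the `conforms` check, and the MOF-style layer labels (M1/M2) are illustrative assumptions introduced here, not the actual ISO/IEC 19763-3 structures.

```python
# A toy illustration of the model / meta-model layering described above.
# The names and structures below are hypothetical, chosen only to show
# how a meta-model captures the common constructs shared by many models.

# M2 (meta-model): common constructs extracted from a family of models
metamodel = {
    "Class": {"attributes": ["name", "properties"]},
    "Property": {"attributes": ["name", "type"]},
}

# M1 (model): a concrete model expressed in terms of the meta-model,
# here describing a software component as registered in a repository
component_model = {
    "construct": "Class",
    "name": "SoftwareComponent",
    "properties": [
        {"construct": "Property", "name": "identifier", "type": "string"},
        {"construct": "Property", "name": "version", "type": "string"},
    ],
}

def conforms(model, mm):
    """Check that a model uses only constructs defined in the meta-model."""
    if model["construct"] not in mm:
        return False
    return all(p["construct"] in mm for p in model.get("properties", []))

print(conforms(component_model, metamodel))  # True: the model conforms
```

In this sketch, conformance checking is the meta-modeling relationship in miniature: the meta-model constrains which constructs a model may use, which is what allows heterogeneous models to be registered and compared against one shared meta-model.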
