Enhanced F-Perceptory Approach for Dealing With Geographic Data Imprecision: From the Conceptual Modeling to the Fuzzy Geographical Database Building

Besma Khalfi, Cyril De Runz, Herman Akdag
DOI: 10.4018/978-1-5225-7033-2.ch019

Abstract

When analyzing spatial issues, the geographer is often confronted with problems concerning the uncertainty of the available information. These problems may affect the geometric or semantic quality of objects and, as a result, a low precision must be considered. It is therefore necessary to develop representation and modeling methods suited to the imprecise nature of geographic data. This recently led to the proposal of F-Perceptory for managing fuzzy geographic data modeling. The model described in Zoghlami et al. (2011) has some limitations: F-Perceptory does not manage fuzzy composite geographic objects. This chapter proposes to enhance the approach by managing this type of object in the modeling and in its transformation to UML. On the technical level, the commonly used object modeling tools do not take fuzzy data into account. The authors propose new functional modules integrated into an existing CASE tool.

Introduction

The considerable development of geographic information, whether in a professional context or for public use, forces researchers to reconsider more seriously the issue of data quality, whose impact directly influences the reliability of the spatial analyses produced and the resulting decisions. Geographic information is characterized by location data in a georeferenced space. Each geographic entity is characterized by its shape (geometry) and its location (spatial coordinates). Spatial representation, spatial database querying, and any other spatial analysis use these entities as variables or criteria. The quality of spatial data is a major concern of geographers. It is measured through criteria such as geometric precision, completeness, semantic precision, logical consistency, timeliness, etc.
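As a minimal sketch of this view (an illustration assumed by this rewrite, not code from the chapter), a geographic entity can be represented as a record pairing a geometry with its thematic attributes:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class GeoEntity:
    """A geographic entity: a shape (geometry) plus thematic attributes.
    The geometry is reduced here to a list of (x, y) vertices expressed
    in a georeferenced coordinate system."""
    name: str
    geometry: List[Tuple[float, float]]
    attributes: Dict[str, object] = field(default_factory=dict)

# Example: a crisp lake polygon usable as a variable in a spatial query.
lake = GeoEntity("Lake A",
                 [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)],
                 {"type": "lake"})
```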

Standards in geographic information quality consider two types of quality: internal and external. According to Devillers (2004) and to ISO 19113, 19114, 19115, 19138, and 19157:2013, internal quality concerns the accuracy of the data with respect to the real world, while external quality expresses the ability of the product to meet the particular requirements of a user. In general, the quality of spatial data is based on five components:

  1. Genealogy,
  2. Geometric precision,
  3. Semantic attribute accuracy,
  4. Completeness, and
  5. Logical consistency.

Genealogy contains descriptions of the acquisition processes and derivation methods, including all the transformations leading to the final result (reference). Geometric precision gives an idea of the position differences between database objects and real objects. Semantic precision compares a measurement of a spatial attribute with another measurement. Semantic consistency refers to the relevance of the meaning of geographic objects rather than to their geometric representation (Salgé, 1995). Completeness describes the extent to which the entities of a data set represent all the geographic items that exist in the study area. Logical consistency measures the level of conformity of the data to the structural characteristics of the data model (attribute integrity constraints and topological constraints).
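The two measurable components above lend themselves to simple formulas. As a hedged sketch (the metric choices, such as RMSE for positional error, are assumptions of this rewrite, not prescriptions of the chapter), geometric precision and completeness could be quantified as follows:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def geometric_precision(db_points: List[Point], ref_points: List[Point]) -> float:
    """Root-mean-square positional error between database objects and
    their matched real-world reference positions (lower is better)."""
    assert db_points and len(db_points) == len(ref_points)
    squared = [(dx - rx) ** 2 + (dy - ry) ** 2
               for (dx, dy), (rx, ry) in zip(db_points, ref_points)]
    return math.sqrt(sum(squared) / len(squared))

def completeness(entities_in_dataset: int, entities_in_study_area: int) -> float:
    """Share of the study area's geographic items that the data set
    actually represents, in [0, 1]."""
    return entities_in_dataset / entities_in_study_area
```

For instance, `completeness(90, 100)` yields 0.9, i.e., 10% of the real items are missing from the data set.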

These components define the first orientation of spatial data quality management, which concerns quality measurement. The second orientation covers spatial data modeling, where the many methods and tools developed to represent, model, or analyze geographic data often struggle to meet the real needs of the geomatics community (Fisher, 2003). This orientation addresses the question of how to handle, represent, and analyze data imperfection. The two orientations converge in order to produce either good-quality data or a better closeness of the observed data to the complexity of the world. The work presented in this chapter follows the second quality management approach, that is, to allow using data in their original nature, without simplification or subjective expert projection.

In this perspective, it is important to consider geographic information in its imperfect (imprecise, uncertain, etc.) nature, to study it, and to integrate it into the analysis process. Representing imprecision and propagating its treatment from modeling to its manipulation in databases is therefore the main goal to reach. This leads to adapting the classical representation, storage, and processing methods, as well as giving the models that use such data special consideration.

Indeed, classical design methods define entities, attributes, data, and relations in a crisp form. In most cases, data are retrieved or combined using simple Boolean algebra and set theory, and as a result a considerable quantity of useful information is lost. In order to obtain a very reliable information system, two alternatives are possible: either limit the information system to the portion of the real world for which information is available and can be considered reliable, thereby losing much useful information, or define an information system that takes imperfect information into account. This chapter considers the second approach and mainly deals with imprecise data.
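To make the contrast concrete, the sketch below (an illustration under assumed thresholds, not the chapter's implementation) compares a crisp Boolean selection with a fuzzy membership degree for a hypothetical "near the river" criterion based on distance:

```python
def crisp_near(distance_m: float, threshold_m: float = 100.0) -> bool:
    """Boolean set membership: an object either satisfies the
    criterion or is discarded entirely."""
    return distance_m <= threshold_m

def fuzzy_near(distance_m: float, core_m: float = 100.0,
               support_m: float = 300.0) -> float:
    """Fuzzy membership degree in [0, 1]: full membership up to core_m,
    linear decrease until support_m, then zero."""
    if distance_m <= core_m:
        return 1.0
    if distance_m >= support_m:
        return 0.0
    return (support_m - distance_m) / (support_m - core_m)

# An object 150 m away is rejected outright by the crisp query (False)
# but keeps a partial membership degree of 0.75 in the fuzzy one.
print(crisp_near(150.0), fuzzy_near(150.0))
```

It is exactly this graded information, discarded by crisp retrieval, that a fuzzy geographical database aims to preserve.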
