A Particle Swarm Optimization Approach for Reuse Guided Case Retrieval

Nabila Nouaouria, Mounir Boukadoum, Robert Proulx
DOI: 10.4018/ijssci.2014070102

Abstract

The success of Case Based Reasoning (CBR) problem solving rests mainly on the recall process. The ideal CBR memory is one that speeds up the retrieval step while improving the reuse of the retrieved cases. In this paper, the authors present a novel associative memory model to perform the retrieval stage in a case based reasoning system. The described approach makes no prior assumption about a specific organization of the case memory, thus leading to a generic recall process. This is made possible by using Particle Swarm Optimization (PSO) to compute the neighborhood of a new problem, followed by direct access to the cases it contains. The fitness function of the PSO stage has a reuse semantic that combines similarity and adaptability as criteria for optimal case retrieval. The model was evaluated on two proprietary databases and compared to the flat memory model for performance. The obtained results are very promising.
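The paper's exact fitness formulation is not given in this preview; as a loose illustration of the idea only, the following Python sketch scores a swarm particle's position by a weighted combination of similarity to the query problem and a stand-in adaptability estimate. The function names, the linear weighting, and the distance-based measures are all assumptions, not the authors' actual formulation.

import numpy as np

def similarity(query, position):
    # Similarity taken as the inverse of Euclidean distance (assumed measure).
    return 1.0 / (1.0 + np.linalg.norm(np.asarray(query) - np.asarray(position)))

def adaptability(position, case_base):
    # Hypothetical stand-in for adaptability: proximity of the probed
    # position to the nearest stored case in the memory.
    return max(similarity(position, c) for c in case_base)

def fitness(position, query, case_base, alpha=0.5):
    # Reuse-oriented fitness: a weighted mix of similarity and adaptability.
    # The value of alpha and the linear combination are illustrative only.
    return alpha * similarity(query, position) + (1.0 - alpha) * adaptability(position, case_base)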

2. Case Retrieval in the Literature

Several memory organizations can be used for case retrieval, each with its own associated search algorithm. They may be divided as follows.

Flat memory/brute force algorithm: cases are stored sequentially in a simple list, array, or file (Kolodner, 1993). Cases are retrieved by sequentially applying a matching function to each case in the file and keeping track of the matching scores; the case with the best match is returned. There is no particular organization of the cases, and the retrieval algorithm is quite simple since the matching heuristics do all the work. The major advantage of this memory organization is that the entire case library is searched; as a result, the accuracy of retrieval depends only on how good the match function is. Moreover, adding a new case is inexpensive, but the search process becomes costly when the case base grows large. To remedy this disadvantage, parallel implementations have been used (Kolodner, 1993). A minimal sketch of this brute-force scheme is given below.
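The following Python sketch assumes a user-supplied match function returning a numeric score (higher is better); the names are illustrative only.

def retrieve_flat(case_base, query, match):
    # Brute-force retrieval over a flat case memory: apply the matching
    # function to every stored case and return the best-scoring one.
    best_case, best_score = None, float("-inf")
    for case in case_base:
        score = match(query, case)  # the matching heuristic does all the work
        if score > best_score:
            best_case, best_score = case, score
    return best_case, best_score

# Toy usage with a negative squared-distance match on feature vectors:
match = lambda q, c: -sum((a - b) ** 2 for a, b in zip(q, c))
best, score = retrieve_flat([(1, 2), (3, 4), (0, 1)], (2, 2), match)

Note that the cost is linear in the size of the case base, which is exactly the weakness the parallel implementations mentioned above are meant to address.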

Structured memory/index based algorithm: here, the CBR memory is enhanced with various generalized structures such as concepts, prototypes, and abstract cases. The accumulation of generalizations or abstractions facilitates the assessment of the situation and allows control of indexation. These structures can be organized via conceptual hierarchies, decision trees, object-oriented taxonomies, formal concept lattices, and B-trees (Bichindaritz, 2006). Also in this category are Shared-Feature Networks (Kolodner, 1993), where cases that present similarities are gathered in the same cluster and hierarchies are formed as the clusters are subdivided into smaller ones, and, dually, Discrimination Networks, where cases are discriminated as a side effect of the clustering in shared-feature networks (Bartsch-Spörl et al., 1999). A sketch of top-down retrieval in such a structure follows.
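To make the index-based idea concrete, here is a hedged Python sketch of top-down retrieval in a simple discrimination-style hierarchy: internal nodes test one attribute, and the search follows the branch matching the query's value until it reaches a leaf cluster of cases. The node layout and the fallback policy for missing attributes are assumptions for illustration, not a reconstruction of any particular system cited above.

class Node:
    # Internal nodes question one attribute; leaves hold a cluster of cases.
    def __init__(self, attribute=None, children=None, cases=None):
        self.attribute = attribute      # attribute tested at this node
        self.children = children or {}  # attribute value -> child Node
        self.cases = cases or []        # non-empty only at leaves

def retrieve_indexed(root, query):
    # Top-down traversal: answer each internal node's question in turn.
    # If the query lacks the attribute (incomplete information), a branch
    # must be guessed (here, arbitrarily the first child), which may send
    # the search down an erroneous path.
    node = root
    while node.children:
        value = query.get(node.attribute)  # query is a dict of attributes
        node = node.children.get(value) or next(iter(node.children.values()))
    return node.cases  # candidate cluster found at the leaf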

All the memory models that use a top-down search share two desirable features (Lenz et al., 1998): data structuring by regrouping related objects, and efficient retrieval by traditional tree search algorithms. Unfortunately, they also have potential limitations: the memory is traversed by answering an ordered sequence of internal node questions, which can lead to erroneous paths when information is incomplete (the arbitrary fallback branch in the sketch above illustrates this), and once a cluster at a tree leaf is reached, neighboring clusters containing similar cases are difficult to access. Two notable exceptions in the category of index-based approaches are the Fish & Shrink model (Schaaf, 1996) and the CRN model (Lenz et al., 1998).
